Mar 09 13:20:43 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 09 13:20:43 crc restorecon[4759]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 13:20:43 crc restorecon[4759]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 09 13:20:43 crc restorecon[4759]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc 
restorecon[4759]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:43 crc restorecon[4759]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 13:20:43 crc restorecon[4759]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:43 crc restorecon[4759]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:43 crc 
restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 
13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 13:20:43 crc restorecon[4759]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:43 crc 
restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 13:20:43 crc restorecon[4759]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 13:20:43 crc restorecon[4759]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 13:20:43 crc restorecon[4759]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 13:20:43 crc 
restorecon[4759]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 13:20:43 crc restorecon[4759]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:43 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 
crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 
crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc 
restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc 
restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc 
restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:44 crc restorecon[4759]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 13:20:44 crc restorecon[4759]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 13:20:44 crc restorecon[4759]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Mar 09 13:20:45 crc kubenswrapper[4764]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 09 13:20:45 crc kubenswrapper[4764]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 09 13:20:45 crc kubenswrapper[4764]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 09 13:20:45 crc kubenswrapper[4764]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 09 13:20:45 crc kubenswrapper[4764]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 09 13:20:45 crc kubenswrapper[4764]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.274066    4764 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282550    4764 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282609    4764 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282615    4764 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282621    4764 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282626    4764 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282632    4764 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282637    4764 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282662    4764 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282668    4764 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282674    4764 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282683    4764 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282690    4764 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282695    4764 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282700    4764 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282704    4764 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282711    4764 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282717    4764 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282724    4764 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282731    4764 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282737    4764 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282745    4764 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282751    4764 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282756    4764 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282761    4764 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282766    4764 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282771    4764 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282777    4764 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282782    4764 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282787    4764 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282791    4764 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282797    4764 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282802 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282807 4764 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282813 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282818 4764 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282823 4764 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282828 4764 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282833 4764 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282837 4764 feature_gate.go:330] unrecognized feature gate: Example Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282842 4764 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282847 4764 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282852 4764 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282856 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282862 4764 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282867 4764 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282872 4764 
feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282877 4764 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282883 4764 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282889 4764 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282894 4764 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282901 4764 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282907 4764 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282914 4764 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282920 4764 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282926 4764 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282931 4764 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282935 4764 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282939 4764 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282943 4764 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282948 4764 
feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282952 4764 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282958 4764 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282962 4764 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282968 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282973 4764 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282978 4764 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282982 4764 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282986 4764 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282991 4764 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.282995 4764 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.283001 4764 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283202 4764 flags.go:64] FLAG: --address="0.0.0.0" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283226 4764 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283238 4764 flags.go:64] FLAG: --anonymous-auth="true" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283246 4764 flags.go:64] FLAG: 
--application-metrics-count-limit="100" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283258 4764 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283265 4764 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283274 4764 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283282 4764 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283292 4764 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283298 4764 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283304 4764 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283311 4764 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283316 4764 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283322 4764 flags.go:64] FLAG: --cgroup-root="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283328 4764 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283334 4764 flags.go:64] FLAG: --client-ca-file="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283339 4764 flags.go:64] FLAG: --cloud-config="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283345 4764 flags.go:64] FLAG: --cloud-provider="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283350 4764 flags.go:64] FLAG: --cluster-dns="[]" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283359 4764 flags.go:64] FLAG: --cluster-domain="" Mar 09 13:20:45 crc 
kubenswrapper[4764]: I0309 13:20:45.283365 4764 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283370 4764 flags.go:64] FLAG: --config-dir="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283376 4764 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283382 4764 flags.go:64] FLAG: --container-log-max-files="5" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283391 4764 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283397 4764 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283404 4764 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283410 4764 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283417 4764 flags.go:64] FLAG: --contention-profiling="false" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283424 4764 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283430 4764 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283437 4764 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283443 4764 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283452 4764 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283458 4764 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283463 4764 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283469 
4764 flags.go:64] FLAG: --enable-load-reader="false" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283474 4764 flags.go:64] FLAG: --enable-server="true" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283480 4764 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283488 4764 flags.go:64] FLAG: --event-burst="100" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283494 4764 flags.go:64] FLAG: --event-qps="50" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283499 4764 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283505 4764 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283510 4764 flags.go:64] FLAG: --eviction-hard="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283519 4764 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283526 4764 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283532 4764 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283537 4764 flags.go:64] FLAG: --eviction-soft="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283542 4764 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283548 4764 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283553 4764 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283559 4764 flags.go:64] FLAG: --experimental-mounter-path="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283564 4764 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 
13:20:45.283569 4764 flags.go:64] FLAG: --fail-swap-on="true" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283574 4764 flags.go:64] FLAG: --feature-gates="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283582 4764 flags.go:64] FLAG: --file-check-frequency="20s" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283587 4764 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283593 4764 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283599 4764 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283604 4764 flags.go:64] FLAG: --healthz-port="10248" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283610 4764 flags.go:64] FLAG: --help="false" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283616 4764 flags.go:64] FLAG: --hostname-override="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283621 4764 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283630 4764 flags.go:64] FLAG: --http-check-frequency="20s" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283636 4764 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283667 4764 flags.go:64] FLAG: --image-credential-provider-config="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283674 4764 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283679 4764 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283685 4764 flags.go:64] FLAG: --image-service-endpoint="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283690 4764 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283696 
4764 flags.go:64] FLAG: --kube-api-burst="100" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283702 4764 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283708 4764 flags.go:64] FLAG: --kube-api-qps="50" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283713 4764 flags.go:64] FLAG: --kube-reserved="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283719 4764 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283724 4764 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283730 4764 flags.go:64] FLAG: --kubelet-cgroups="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283738 4764 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283744 4764 flags.go:64] FLAG: --lock-file="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283749 4764 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283757 4764 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283763 4764 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283774 4764 flags.go:64] FLAG: --log-json-split-stream="false" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283780 4764 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283786 4764 flags.go:64] FLAG: --log-text-split-stream="false" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283791 4764 flags.go:64] FLAG: --logging-format="text" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283796 4764 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 09 13:20:45 crc kubenswrapper[4764]: 
I0309 13:20:45.283803 4764 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283809 4764 flags.go:64] FLAG: --manifest-url="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283815 4764 flags.go:64] FLAG: --manifest-url-header="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283824 4764 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283830 4764 flags.go:64] FLAG: --max-open-files="1000000" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283838 4764 flags.go:64] FLAG: --max-pods="110" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283844 4764 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283850 4764 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283857 4764 flags.go:64] FLAG: --memory-manager-policy="None" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283863 4764 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283869 4764 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283874 4764 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283882 4764 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283904 4764 flags.go:64] FLAG: --node-status-max-images="50" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283911 4764 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283918 4764 flags.go:64] FLAG: --oom-score-adj="-999" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283924 4764 
flags.go:64] FLAG: --pod-cidr="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283929 4764 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283941 4764 flags.go:64] FLAG: --pod-manifest-path="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283947 4764 flags.go:64] FLAG: --pod-max-pids="-1" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283954 4764 flags.go:64] FLAG: --pods-per-core="0" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283960 4764 flags.go:64] FLAG: --port="10250" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283965 4764 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283971 4764 flags.go:64] FLAG: --provider-id="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283977 4764 flags.go:64] FLAG: --qos-reserved="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283984 4764 flags.go:64] FLAG: --read-only-port="10255" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283990 4764 flags.go:64] FLAG: --register-node="true" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.283995 4764 flags.go:64] FLAG: --register-schedulable="true" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284000 4764 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284013 4764 flags.go:64] FLAG: --registry-burst="10" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284018 4764 flags.go:64] FLAG: --registry-qps="5" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284024 4764 flags.go:64] FLAG: --reserved-cpus="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284029 4764 flags.go:64] FLAG: --reserved-memory="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 
13:20:45.284037 4764 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284042 4764 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284048 4764 flags.go:64] FLAG: --rotate-certificates="false" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284053 4764 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284059 4764 flags.go:64] FLAG: --runonce="false" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284065 4764 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284071 4764 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284077 4764 flags.go:64] FLAG: --seccomp-default="false" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284082 4764 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284087 4764 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284093 4764 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284099 4764 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284105 4764 flags.go:64] FLAG: --storage-driver-password="root" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284111 4764 flags.go:64] FLAG: --storage-driver-secure="false" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284117 4764 flags.go:64] FLAG: --storage-driver-table="stats" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284122 4764 flags.go:64] FLAG: --storage-driver-user="root" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284128 4764 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" 
Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284133 4764 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284139 4764 flags.go:64] FLAG: --system-cgroups="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284145 4764 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284153 4764 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284158 4764 flags.go:64] FLAG: --tls-cert-file="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284164 4764 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284173 4764 flags.go:64] FLAG: --tls-min-version="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284179 4764 flags.go:64] FLAG: --tls-private-key-file="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284185 4764 flags.go:64] FLAG: --topology-manager-policy="none" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284190 4764 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284196 4764 flags.go:64] FLAG: --topology-manager-scope="container" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284203 4764 flags.go:64] FLAG: --v="2" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284211 4764 flags.go:64] FLAG: --version="false" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284220 4764 flags.go:64] FLAG: --vmodule="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284228 4764 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.284236 4764 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284375 4764 feature_gate.go:330] unrecognized feature gate: OVNObservability 
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284386 4764 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284392 4764 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284398 4764 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284403 4764 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284409 4764 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284414 4764 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284419 4764 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284424 4764 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284429 4764 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284433 4764 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284438 4764 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284443 4764 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284448 4764 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284453 4764 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284458 4764 
feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284463 4764 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284469 4764 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284474 4764 feature_gate.go:330] unrecognized feature gate: Example Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284479 4764 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284484 4764 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284490 4764 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284494 4764 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284501 4764 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284505 4764 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284510 4764 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284514 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284519 4764 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284525 4764 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284531 4764 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284536 4764 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284540 4764 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284547 4764 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284551 4764 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284556 4764 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284562 4764 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284567 4764 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284572 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284577 4764 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284582 4764 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284586 4764 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284591 4764 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284595 4764 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284600 4764 
feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284605 4764 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284609 4764 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284614 4764 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284618 4764 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284623 4764 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284628 4764 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284632 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284637 4764 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284663 4764 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284671 4764 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284676 4764 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284681 4764 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284688 4764 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284694 4764 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284699 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284704 4764 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284711 4764 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284715 4764 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284719 4764 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284723 4764 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284728 4764 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284733 4764 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284739 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284744 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284748 4764 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284753 4764 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.284757 4764 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.285455 4764 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.302484 4764 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.302543 4764 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302606 4764 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302614 4764 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302618 4764 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 
13:20:45.302622 4764 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302626 4764 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302630 4764 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302633 4764 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302637 4764 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302655 4764 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302659 4764 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302663 4764 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302667 4764 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302670 4764 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302673 4764 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302677 4764 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302681 4764 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302686 4764 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302689 4764 feature_gate.go:330] unrecognized feature gate: 
UpgradeStatus Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302693 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302696 4764 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302700 4764 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302703 4764 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302707 4764 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302710 4764 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302714 4764 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302719 4764 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302724 4764 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302727 4764 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302731 4764 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302734 4764 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302738 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302741 4764 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302746 4764 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302754 4764 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302759 4764 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302763 4764 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302767 4764 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302771 4764 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302775 4764 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302778 4764 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302782 4764 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302786 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302791 4764 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302795 4764 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302799 4764 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302803 4764 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302806 4764 feature_gate.go:330] unrecognized feature gate: Example Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302810 4764 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302814 4764 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302818 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302821 4764 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302824 4764 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302829 4764 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302832 4764 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302836 4764 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302839 4764 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302843 4764 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 
13:20:45.302846 4764 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302850 4764 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302853 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302857 4764 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302861 4764 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302864 4764 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302868 4764 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302872 4764 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302876 4764 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302879 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302883 4764 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302886 4764 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302890 4764 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.302894 4764 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.302901 4764 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.303656 4764 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.304749 4764 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.304760 4764 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.304770 4764 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.304779 4764 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.304787 4764 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.304796 4764 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.304805 4764 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.304813 4764 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.304822 4764 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.304832 4764 
feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.304840 4764 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.304860 4764 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.304868 4764 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.304876 4764 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.304890 4764 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.304905 4764 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.304917 4764 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.304927 4764 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.304937 4764 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.304947 4764 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.304956 4764 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.304966 4764 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.304975 4764 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.304994 4764 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305004 4764 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305014 4764 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305024 4764 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305033 4764 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305042 4764 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305109 4764 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305142 4764 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305151 4764 feature_gate.go:330] 
unrecognized feature gate: ExternalOIDC Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305160 4764 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305168 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305386 4764 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305401 4764 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305410 4764 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305418 4764 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305426 4764 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305434 4764 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305442 4764 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305450 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305457 4764 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305465 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305473 4764 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305481 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 
09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305493 4764 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305504 4764 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305512 4764 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305521 4764 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305528 4764 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305536 4764 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305546 4764 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305554 4764 feature_gate.go:330] unrecognized feature gate: Example Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305562 4764 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305570 4764 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305577 4764 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305585 4764 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305593 4764 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305600 4764 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305608 
4764 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305616 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305624 4764 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305631 4764 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305670 4764 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305683 4764 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305691 4764 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305700 4764 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305708 4764 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.305718 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.305733 4764 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.306025 4764 
server.go:940] "Client rotation is on, will bootstrap in background" Mar 09 13:20:45 crc kubenswrapper[4764]: E0309 13:20:45.315176 4764 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.324460 4764 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.324663 4764 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.332368 4764 server.go:997] "Starting client certificate rotation" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.332434 4764 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.332596 4764 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.377618 4764 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.380003 4764 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 09 13:20:45 crc kubenswrapper[4764]: E0309 13:20:45.380279 4764 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.52:6443: connect: connection refused" logger="UnhandledError" Mar 09 13:20:45 crc 
kubenswrapper[4764]: I0309 13:20:45.398862 4764 log.go:25] "Validated CRI v1 runtime API" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.440572 4764 log.go:25] "Validated CRI v1 image API" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.442097 4764 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.446503 4764 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-09-13-15-57-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.446552 4764 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.471120 4764 manager.go:217] Machine: {Timestamp:2026-03-09 13:20:45.468265889 +0000 UTC m=+0.718437817 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:470a86bd-e6aa-42c1-b220-b7b8c0289210 BootID:44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 
Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:bd:90:da Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:bd:90:da Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:1b:c4:74 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:fb:0f:43 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:fa:6f:28 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:ca:a1:48 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:16:d8:db Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ca:f8:b9:bb:96:21 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:be:6f:80:44:37:22 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 
Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 
Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.471343 4764 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.471482 4764 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.471838 4764 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.472073 4764 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.472104 4764 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.472357 4764 topology_manager.go:138] "Creating topology manager with none policy" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.472367 4764 container_manager_linux.go:303] "Creating device plugin manager" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.472884 4764 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.472913 4764 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.473746 4764 state_mem.go:36] "Initialized new in-memory state store" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.473840 4764 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.477154 4764 kubelet.go:418] "Attempting to sync node with API server" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.477176 4764 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.477197 4764 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.477215 4764 kubelet.go:324] "Adding apiserver pod source" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.477228 4764 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.482879 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.52:6443: connect: connection refused Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.482898 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.52:6443: connect: connection refused Mar 09 13:20:45 crc kubenswrapper[4764]: E0309 13:20:45.482958 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: 
failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.52:6443: connect: connection refused" logger="UnhandledError" Mar 09 13:20:45 crc kubenswrapper[4764]: E0309 13:20:45.483000 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.52:6443: connect: connection refused" logger="UnhandledError" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.483279 4764 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.484325 4764 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.487345 4764 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.489495 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.489538 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.489550 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.489559 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.489574 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.489583 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.489592 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.489607 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.489618 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.489627 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.489663 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.489698 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.491006 4764 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.491511 4764 server.go:1280] "Started kubelet" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.491621 4764 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.491831 4764 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.492455 4764 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 09 13:20:45 crc systemd[1]: Started Kubernetes Kubelet. Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.493314 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.52:6443: connect: connection refused Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.493952 4764 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.493995 4764 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 09 13:20:45 crc kubenswrapper[4764]: E0309 13:20:45.494122 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.494154 4764 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.494165 4764 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.494230 4764 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.494865 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list 
*v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.52:6443: connect: connection refused Mar 09 13:20:45 crc kubenswrapper[4764]: E0309 13:20:45.494920 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.52:6443: connect: connection refused" logger="UnhandledError" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.494954 4764 server.go:460] "Adding debug handlers to kubelet server" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.500829 4764 factory.go:153] Registering CRI-O factory Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.500856 4764 factory.go:221] Registration of the crio container factory successfully Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.500965 4764 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.501027 4764 factory.go:55] Registering systemd factory Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.501039 4764 factory.go:221] Registration of the systemd container factory successfully Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.501068 4764 factory.go:103] Registering Raw factory Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.501121 4764 manager.go:1196] Started watching for new ooms in manager Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.502405 4764 manager.go:319] Starting recovery of all containers Mar 09 13:20:45 crc kubenswrapper[4764]: E0309 13:20:45.502342 4764 event.go:368] "Unable to write event (may retry after sleeping)" 
err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.52:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189b2ee66c1ac777 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.491472247 +0000 UTC m=+0.741644175,LastTimestamp:2026-03-09 13:20:45.491472247 +0000 UTC m=+0.741644175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:45 crc kubenswrapper[4764]: E0309 13:20:45.504311 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.52:6443: connect: connection refused" interval="200ms" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510016 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510066 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510082 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" 
seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510096 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510108 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510120 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510132 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510146 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510161 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510172 
4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510183 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510194 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510205 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510220 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510231 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510244 4764 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510256 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510269 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510281 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510293 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510304 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510317 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510329 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510342 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510353 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510365 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510381 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510394 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510406 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510419 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510431 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510474 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510486 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510519 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510534 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510547 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510560 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510572 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510584 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510594 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510637 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510669 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510682 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510695 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510707 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510720 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510733 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510746 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510758 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510803 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510817 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510829 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510847 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510862 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510873 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510887 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510902 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510914 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510926 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510937 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510949 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510960 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510973 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510986 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.510999 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511015 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511026 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511038 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511050 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511063 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511074 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511086 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511098 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511111 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511123 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511135 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" 
seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511148 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511161 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511172 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511185 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511196 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511208 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 
13:20:45.511219 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511230 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511240 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511251 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511261 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511273 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511284 4764 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511295 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511305 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511318 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511329 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511340 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511353 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511367 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511381 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511398 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511410 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511423 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511434 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.511446 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.514315 4764 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.514386 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.514412 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.514442 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.514467 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.514487 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.514505 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.514526 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.514547 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.514571 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.514588 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.514605 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.514621 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.514663 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.514681 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.514697 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.514712 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.514728 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.514745 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.514761 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.514778 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.514793 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.514811 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.514828 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.514846 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.514866 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.514883 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.514901 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.514916 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.514935 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.514952 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.514986 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515003 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515020 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515037 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" 
seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515054 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515071 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515086 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515100 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515112 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515126 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515139 
4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515151 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515163 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515178 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515191 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515204 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515215 4764 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515229 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515241 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515253 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515267 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515283 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515299 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515314 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515331 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515350 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515364 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515376 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515393 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515410 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515426 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515442 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515455 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515467 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515481 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515495 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515511 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515526 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515542 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515557 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515570 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515583 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515598 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515615 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515632 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515674 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515690 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515720 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515738 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515754 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515770 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515786 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515802 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515818 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515835 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515868 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515883 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515900 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515917 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515934 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515951 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515966 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515982 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.515997 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.516012 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" 
seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.516026 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.516037 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.516051 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.516062 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.516074 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.516085 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 
13:20:45.516098 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.516110 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.516122 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.516137 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.516148 4764 reconstruct.go:97] "Volume reconstruction finished" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.516157 4764 reconciler.go:26] "Reconciler: start to sync state" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.519086 4764 manager.go:324] Recovery completed Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.530173 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.532725 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.532760 4764 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.532772 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.536285 4764 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.536303 4764 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.536322 4764 state_mem.go:36] "Initialized new in-memory state store" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.556367 4764 policy_none.go:49] "None policy: Start" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.556624 4764 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.558377 4764 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.558429 4764 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.558456 4764 kubelet.go:2335] "Starting kubelet main sync loop" Mar 09 13:20:45 crc kubenswrapper[4764]: E0309 13:20:45.558521 4764 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.559311 4764 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.559334 4764 state_mem.go:35] "Initializing new in-memory state store" Mar 09 13:20:45 crc kubenswrapper[4764]: W0309 13:20:45.559790 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.52:6443: connect: connection refused Mar 09 13:20:45 crc kubenswrapper[4764]: E0309 13:20:45.559846 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.52:6443: connect: connection refused" logger="UnhandledError" Mar 09 13:20:45 crc kubenswrapper[4764]: E0309 13:20:45.595102 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.612826 4764 manager.go:334] "Starting Device Plugin manager" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.612871 4764 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.612882 4764 server.go:79] "Starting device plugin registration server" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.613297 4764 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.613307 4764 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.613471 4764 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.613542 4764 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.613549 4764 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 09 13:20:45 crc kubenswrapper[4764]: E0309 13:20:45.620125 4764 eviction_manager.go:285] 
"Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.659377 4764 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.659503 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.660719 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.660780 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.660793 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.661029 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.661217 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.661250 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.662197 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.662222 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.662232 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.662362 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.662487 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.662522 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.662887 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.662914 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.662923 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.663029 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.663043 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.663051 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.663141 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.663179 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.663200 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.663225 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.663346 
4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.663371 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.664008 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.664051 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.664061 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.664179 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.664291 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.664330 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.664984 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.665002 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.665010 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.665059 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.665092 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.665103 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.665603 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.665633 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.665654 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.665828 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.665856 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.666595 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.666614 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.666630 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:45 crc kubenswrapper[4764]: E0309 13:20:45.705503 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.52:6443: connect: connection refused" interval="400ms" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.713701 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.714716 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.714746 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.714755 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.714777 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:20:45 crc kubenswrapper[4764]: E0309 13:20:45.715165 4764 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.52:6443: connect: connection refused" node="crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.717590 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.717620 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.717660 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.717684 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.717735 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.717763 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.717781 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.717800 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.717814 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.717832 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.717847 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.717862 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.717882 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.718027 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.718150 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.819286 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.819358 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.819388 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.819411 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.819434 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.819455 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.819476 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.819483 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.819540 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.819555 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.819502 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.819576 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.819612 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.819633 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.819667 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.819746 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:20:45 crc 
kubenswrapper[4764]: I0309 13:20:45.819818 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.819705 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.819752 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.819640 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.819777 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.819930 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.820004 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.820023 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.820043 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.819970 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.819680 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.820117 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.820097 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.820212 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.915836 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.918507 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.918550 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.918563 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.918590 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:20:45 crc kubenswrapper[4764]: E0309 13:20:45.919194 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.52:6443: connect: connection refused" node="crc" Mar 09 13:20:45 crc kubenswrapper[4764]: I0309 13:20:45.990713 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 13:20:46 crc kubenswrapper[4764]: I0309 13:20:46.000928 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 09 13:20:46 crc kubenswrapper[4764]: I0309 13:20:46.021182 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:46 crc kubenswrapper[4764]: W0309 13:20:46.039438 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-da90af28d38c787b1e8d35ab13e6ea14131829d7cd34d361bde15842ecd824e9 WatchSource:0}: Error finding container da90af28d38c787b1e8d35ab13e6ea14131829d7cd34d361bde15842ecd824e9: Status 404 returned error can't find the container with id da90af28d38c787b1e8d35ab13e6ea14131829d7cd34d361bde15842ecd824e9 Mar 09 13:20:46 crc kubenswrapper[4764]: W0309 13:20:46.041513 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-a73507a368d0ddb26aa1dbcb9cb86680b25902ee6b245158e8bf1f0a1d4adda5 WatchSource:0}: Error finding container a73507a368d0ddb26aa1dbcb9cb86680b25902ee6b245158e8bf1f0a1d4adda5: Status 404 returned error can't find the container with id a73507a368d0ddb26aa1dbcb9cb86680b25902ee6b245158e8bf1f0a1d4adda5 Mar 09 13:20:46 crc kubenswrapper[4764]: I0309 13:20:46.045004 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:20:46 crc kubenswrapper[4764]: I0309 13:20:46.052750 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 13:20:46 crc kubenswrapper[4764]: W0309 13:20:46.058690 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-b9d5429a7b7e5aad5f42151da75d5a6bf604089ca37703a5f3f213c921933827 WatchSource:0}: Error finding container b9d5429a7b7e5aad5f42151da75d5a6bf604089ca37703a5f3f213c921933827: Status 404 returned error can't find the container with id b9d5429a7b7e5aad5f42151da75d5a6bf604089ca37703a5f3f213c921933827 Mar 09 13:20:46 crc kubenswrapper[4764]: W0309 13:20:46.071198 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-bdaff8f057864e86b84fd38de29a4925660b611a0113f2005e44ba4d01fb101a WatchSource:0}: Error finding container bdaff8f057864e86b84fd38de29a4925660b611a0113f2005e44ba4d01fb101a: Status 404 returned error can't find the container with id bdaff8f057864e86b84fd38de29a4925660b611a0113f2005e44ba4d01fb101a Mar 09 13:20:46 crc kubenswrapper[4764]: E0309 13:20:46.106697 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.52:6443: connect: connection refused" interval="800ms" Mar 09 13:20:46 crc kubenswrapper[4764]: W0309 13:20:46.302787 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.52:6443: 
connect: connection refused Mar 09 13:20:46 crc kubenswrapper[4764]: E0309 13:20:46.302868 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.52:6443: connect: connection refused" logger="UnhandledError" Mar 09 13:20:46 crc kubenswrapper[4764]: I0309 13:20:46.320337 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:46 crc kubenswrapper[4764]: I0309 13:20:46.322268 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:46 crc kubenswrapper[4764]: I0309 13:20:46.322301 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:46 crc kubenswrapper[4764]: I0309 13:20:46.322312 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:46 crc kubenswrapper[4764]: I0309 13:20:46.322334 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:20:46 crc kubenswrapper[4764]: E0309 13:20:46.322635 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.52:6443: connect: connection refused" node="crc" Mar 09 13:20:46 crc kubenswrapper[4764]: W0309 13:20:46.369823 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.52:6443: connect: connection refused Mar 09 13:20:46 crc kubenswrapper[4764]: E0309 13:20:46.369898 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: 
Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.52:6443: connect: connection refused" logger="UnhandledError" Mar 09 13:20:46 crc kubenswrapper[4764]: I0309 13:20:46.494588 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.52:6443: connect: connection refused Mar 09 13:20:46 crc kubenswrapper[4764]: I0309 13:20:46.563901 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b9d5429a7b7e5aad5f42151da75d5a6bf604089ca37703a5f3f213c921933827"} Mar 09 13:20:46 crc kubenswrapper[4764]: I0309 13:20:46.564998 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3f52b72629c61ca0e26f1963977b231684c94bcbece829ef6ff915bcd30e7447"} Mar 09 13:20:46 crc kubenswrapper[4764]: I0309 13:20:46.566031 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a73507a368d0ddb26aa1dbcb9cb86680b25902ee6b245158e8bf1f0a1d4adda5"} Mar 09 13:20:46 crc kubenswrapper[4764]: I0309 13:20:46.567339 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"da90af28d38c787b1e8d35ab13e6ea14131829d7cd34d361bde15842ecd824e9"} Mar 09 13:20:46 crc kubenswrapper[4764]: I0309 13:20:46.568273 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"bdaff8f057864e86b84fd38de29a4925660b611a0113f2005e44ba4d01fb101a"} Mar 09 13:20:46 crc kubenswrapper[4764]: W0309 13:20:46.644078 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.52:6443: connect: connection refused Mar 09 13:20:46 crc kubenswrapper[4764]: E0309 13:20:46.644148 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.52:6443: connect: connection refused" logger="UnhandledError" Mar 09 13:20:46 crc kubenswrapper[4764]: E0309 13:20:46.907836 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.52:6443: connect: connection refused" interval="1.6s" Mar 09 13:20:47 crc kubenswrapper[4764]: W0309 13:20:47.045441 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.52:6443: connect: connection refused Mar 09 13:20:47 crc kubenswrapper[4764]: E0309 13:20:47.045524 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.52:6443: connect: connection refused" logger="UnhandledError" Mar 09 
13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.122879 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.124537 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.124589 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.124608 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.124669 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:20:47 crc kubenswrapper[4764]: E0309 13:20:47.125219 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.52:6443: connect: connection refused" node="crc" Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.494188 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.52:6443: connect: connection refused Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.549143 4764 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 13:20:47 crc kubenswrapper[4764]: E0309 13:20:47.550123 4764 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.52:6443: connect: connection refused" logger="UnhandledError" 
Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.571846 4764 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292" exitCode=0 Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.571936 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292"} Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.572027 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.572995 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.573030 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.573046 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.577232 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5476061465c7cd40b23fd92e34348e1889754bbfabe1e2bc6419637cf70604d5"} Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.577302 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.577309 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e4cbb681d47f7be8b418b159d390e6857a8a1f5cff40ac621c892019ceef7828"} Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.577401 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fb990e0575b0c9fc6674c0828a3e6e8ebaba534f1375aae50660850aefb97d69"} Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.577465 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c7857b72b1425eeefd8acf226dd6f0d6c783e14267fcfb79dd038118573a2b26"} Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.579364 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.579410 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.579426 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.580625 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3" exitCode=0 Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.580720 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3"} Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.580766 4764 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.581907 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.581937 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.581946 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.582243 4764 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443" exitCode=0 Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.582295 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443"} Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.582409 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.583151 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.583327 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.583368 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.583388 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:47 crc 
kubenswrapper[4764]: I0309 13:20:47.583952 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.584038 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.584051 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.584918 4764 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982" exitCode=0 Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.584956 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982"} Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.584985 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.585787 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.585808 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:47 crc kubenswrapper[4764]: I0309 13:20:47.585817 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:48 crc kubenswrapper[4764]: E0309 13:20:48.230943 4764 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.52:6443: 
connect: connection refused" event="&Event{ObjectMeta:{crc.189b2ee66c1ac777 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.491472247 +0000 UTC m=+0.741644175,LastTimestamp:2026-03-09 13:20:45.491472247 +0000 UTC m=+0.741644175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.494280 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.52:6443: connect: connection refused Mar 09 13:20:48 crc kubenswrapper[4764]: E0309 13:20:48.509195 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.52:6443: connect: connection refused" interval="3.2s" Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.593282 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a"} Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.593331 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3"} Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.593342 4764 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344"} Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.593446 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.594366 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.594403 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.594414 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.597868 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"62b986f679483da9ce4281abd51942798908d1dd6876ca39c947d22c8cd8458c"} Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.597901 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.597908 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f"} Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.597923 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b"} Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.597935 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6"} Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.597947 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7"} Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.598472 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.598502 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.598510 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.600402 4764 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1" exitCode=0 Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.600469 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1"} Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.600484 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.601248 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.601400 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.601494 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.603071 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.603396 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.603539 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"2756a837bdc4537e5cf1848d4c91935799827b254edac2f5a67cf8ff7ce886a6"} Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.604109 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.604221 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.604308 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.604564 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.604584 4764 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.604593 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.726329 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.728110 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.728155 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.728167 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:48 crc kubenswrapper[4764]: I0309 13:20:48.728192 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:20:48 crc kubenswrapper[4764]: E0309 13:20:48.728745 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.52:6443: connect: connection refused" node="crc" Mar 09 13:20:49 crc kubenswrapper[4764]: I0309 13:20:49.608342 4764 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb" exitCode=0 Mar 09 13:20:49 crc kubenswrapper[4764]: I0309 13:20:49.608485 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:49 crc kubenswrapper[4764]: I0309 13:20:49.608531 4764 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 13:20:49 crc kubenswrapper[4764]: I0309 13:20:49.608580 4764 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Mar 09 13:20:49 crc kubenswrapper[4764]: I0309 13:20:49.608614 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:49 crc kubenswrapper[4764]: I0309 13:20:49.608616 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb"} Mar 09 13:20:49 crc kubenswrapper[4764]: I0309 13:20:49.608615 4764 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 13:20:49 crc kubenswrapper[4764]: I0309 13:20:49.608767 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:49 crc kubenswrapper[4764]: I0309 13:20:49.609704 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:49 crc kubenswrapper[4764]: I0309 13:20:49.609771 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:49 crc kubenswrapper[4764]: I0309 13:20:49.609796 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:49 crc kubenswrapper[4764]: I0309 13:20:49.609867 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:49 crc kubenswrapper[4764]: I0309 13:20:49.609892 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:49 crc kubenswrapper[4764]: I0309 13:20:49.609900 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:49 crc kubenswrapper[4764]: I0309 13:20:49.610002 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 09 13:20:49 crc kubenswrapper[4764]: I0309 13:20:49.610015 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:49 crc kubenswrapper[4764]: I0309 13:20:49.610025 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:49 crc kubenswrapper[4764]: I0309 13:20:49.611559 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:49 crc kubenswrapper[4764]: I0309 13:20:49.611617 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:49 crc kubenswrapper[4764]: I0309 13:20:49.611670 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:50 crc kubenswrapper[4764]: I0309 13:20:50.198064 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:50 crc kubenswrapper[4764]: I0309 13:20:50.568090 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:20:50 crc kubenswrapper[4764]: I0309 13:20:50.568267 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:50 crc kubenswrapper[4764]: I0309 13:20:50.569485 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:50 crc kubenswrapper[4764]: I0309 13:20:50.569519 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:50 crc kubenswrapper[4764]: I0309 13:20:50.569530 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:50 crc 
kubenswrapper[4764]: I0309 13:20:50.573806 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:20:50 crc kubenswrapper[4764]: I0309 13:20:50.615798 4764 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 13:20:50 crc kubenswrapper[4764]: I0309 13:20:50.615843 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:50 crc kubenswrapper[4764]: I0309 13:20:50.616319 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:50 crc kubenswrapper[4764]: I0309 13:20:50.616573 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4039a3daa857df7f6171121f2a936d4893913c8374ad7229c758dd93a8821386"} Mar 09 13:20:50 crc kubenswrapper[4764]: I0309 13:20:50.616599 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"98462dd9c7c97ff5402911d179c7739d92d293c4abacea1ed4a394ec9f33dfed"} Mar 09 13:20:50 crc kubenswrapper[4764]: I0309 13:20:50.616609 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4c220cbb5e42a64e54a853893cbdd6e5502e5f1e5cd7fa5aa7fffc3e27c5b2d1"} Mar 09 13:20:50 crc kubenswrapper[4764]: I0309 13:20:50.616617 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"289f145eb4f8412759fdbbc747534a08534b2f848f18206d9f9c2d03adc86944"} Mar 09 13:20:50 crc kubenswrapper[4764]: I0309 13:20:50.616625 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0a9b753c5c47533ae4771e38e6ce4f7a3822bc208557dda27fd1b525c76044f2"} Mar 09 13:20:50 crc kubenswrapper[4764]: I0309 13:20:50.616754 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:50 crc kubenswrapper[4764]: I0309 13:20:50.616950 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:50 crc kubenswrapper[4764]: I0309 13:20:50.616973 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:50 crc kubenswrapper[4764]: I0309 13:20:50.616981 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:50 crc kubenswrapper[4764]: I0309 13:20:50.617126 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:50 crc kubenswrapper[4764]: I0309 13:20:50.617167 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:50 crc kubenswrapper[4764]: I0309 13:20:50.617181 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:50 crc kubenswrapper[4764]: I0309 13:20:50.617448 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:50 crc kubenswrapper[4764]: I0309 13:20:50.617470 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:50 crc kubenswrapper[4764]: I0309 13:20:50.617479 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:50 crc kubenswrapper[4764]: I0309 13:20:50.765749 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:20:51 crc kubenswrapper[4764]: I0309 13:20:51.015281 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:51 crc kubenswrapper[4764]: I0309 13:20:51.617389 4764 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 13:20:51 crc kubenswrapper[4764]: I0309 13:20:51.617411 4764 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 13:20:51 crc kubenswrapper[4764]: I0309 13:20:51.617433 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:51 crc kubenswrapper[4764]: I0309 13:20:51.617453 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:51 crc kubenswrapper[4764]: I0309 13:20:51.617409 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:51 crc kubenswrapper[4764]: I0309 13:20:51.618607 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:51 crc kubenswrapper[4764]: I0309 13:20:51.618631 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:51 crc kubenswrapper[4764]: I0309 13:20:51.618639 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:51 crc kubenswrapper[4764]: I0309 13:20:51.618610 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:51 crc kubenswrapper[4764]: I0309 13:20:51.618876 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:51 crc kubenswrapper[4764]: I0309 13:20:51.618902 4764 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:51 crc kubenswrapper[4764]: I0309 13:20:51.619425 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:51 crc kubenswrapper[4764]: I0309 13:20:51.619457 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:51 crc kubenswrapper[4764]: I0309 13:20:51.619468 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:51 crc kubenswrapper[4764]: I0309 13:20:51.929862 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:51 crc kubenswrapper[4764]: I0309 13:20:51.930898 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:51 crc kubenswrapper[4764]: I0309 13:20:51.930921 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:51 crc kubenswrapper[4764]: I0309 13:20:51.930930 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:51 crc kubenswrapper[4764]: I0309 13:20:51.930948 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:20:51 crc kubenswrapper[4764]: I0309 13:20:51.940568 4764 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 13:20:52 crc kubenswrapper[4764]: I0309 13:20:52.120673 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:20:52 crc kubenswrapper[4764]: I0309 13:20:52.619949 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:52 crc kubenswrapper[4764]: I0309 13:20:52.620783 4764 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:52 crc kubenswrapper[4764]: I0309 13:20:52.620808 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:52 crc kubenswrapper[4764]: I0309 13:20:52.620816 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:52 crc kubenswrapper[4764]: I0309 13:20:52.917793 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:52 crc kubenswrapper[4764]: I0309 13:20:52.917995 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:52 crc kubenswrapper[4764]: I0309 13:20:52.919022 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:52 crc kubenswrapper[4764]: I0309 13:20:52.919058 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:52 crc kubenswrapper[4764]: I0309 13:20:52.919069 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:53 crc kubenswrapper[4764]: I0309 13:20:53.089503 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:20:53 crc kubenswrapper[4764]: I0309 13:20:53.111176 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 13:20:53 crc kubenswrapper[4764]: I0309 13:20:53.111303 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:53 crc kubenswrapper[4764]: I0309 13:20:53.112210 4764 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:53 crc kubenswrapper[4764]: I0309 13:20:53.112239 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:53 crc kubenswrapper[4764]: I0309 13:20:53.112248 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:53 crc kubenswrapper[4764]: I0309 13:20:53.492739 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 09 13:20:53 crc kubenswrapper[4764]: I0309 13:20:53.492928 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:53 crc kubenswrapper[4764]: I0309 13:20:53.494100 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:53 crc kubenswrapper[4764]: I0309 13:20:53.494151 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:53 crc kubenswrapper[4764]: I0309 13:20:53.494163 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:53 crc kubenswrapper[4764]: I0309 13:20:53.622130 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:53 crc kubenswrapper[4764]: I0309 13:20:53.623202 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:53 crc kubenswrapper[4764]: I0309 13:20:53.623244 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:53 crc kubenswrapper[4764]: I0309 13:20:53.623253 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:55 crc kubenswrapper[4764]: E0309 13:20:55.620362 4764 
eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 13:20:56 crc kubenswrapper[4764]: I0309 13:20:56.089550 4764 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 13:20:56 crc kubenswrapper[4764]: I0309 13:20:56.089709 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 13:20:59 crc kubenswrapper[4764]: W0309 13:20:59.043004 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.043120 4764 trace.go:236] Trace[767041527]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Mar-2026 13:20:49.041) (total time: 10001ms): Mar 09 13:20:59 crc kubenswrapper[4764]: Trace[767041527]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:20:59.042) Mar 09 13:20:59 crc kubenswrapper[4764]: Trace[767041527]: [10.001687813s] [10.001687813s] END Mar 09 13:20:59 crc kubenswrapper[4764]: E0309 13:20:59.043141 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.082855 4764 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:58756->192.168.126.11:17697: read: connection reset by peer" start-of-body= Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.082935 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:58756->192.168.126.11:17697: read: connection reset by peer" Mar 09 13:20:59 crc kubenswrapper[4764]: W0309 13:20:59.398539 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.398615 4764 trace.go:236] Trace[72767381]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Mar-2026 13:20:49.396) (total time: 10001ms): Mar 09 13:20:59 crc kubenswrapper[4764]: Trace[72767381]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:20:59.398) Mar 09 13:20:59 crc kubenswrapper[4764]: Trace[72767381]: [10.001664318s] [10.001664318s] END Mar 09 13:20:59 crc kubenswrapper[4764]: E0309 13:20:59.398633 4764 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 09 13:20:59 crc kubenswrapper[4764]: E0309 13:20:59.399162 4764 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:59Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b2ee66c1ac777 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.491472247 +0000 UTC m=+0.741644175,LastTimestamp:2026-03-09 13:20:45.491472247 +0000 UTC m=+0.741644175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:59 crc kubenswrapper[4764]: E0309 13:20:59.402381 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:59Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.403395 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-09T13:20:59Z is after 2026-02-23T05:33:13Z Mar 09 13:20:59 crc kubenswrapper[4764]: E0309 13:20:59.405654 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:59Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.408168 4764 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.408213 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 09 13:20:59 crc kubenswrapper[4764]: E0309 13:20:59.408547 4764 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:20:59 crc kubenswrapper[4764]: W0309 13:20:59.408729 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:59Z is after 2026-02-23T05:33:13Z Mar 09 13:20:59 crc kubenswrapper[4764]: E0309 13:20:59.408798 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:20:59 crc kubenswrapper[4764]: W0309 13:20:59.413392 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:59Z is after 2026-02-23T05:33:13Z Mar 09 13:20:59 crc kubenswrapper[4764]: E0309 13:20:59.413450 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.414724 4764 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path 
\"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.414802 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.496784 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:59Z is after 2026-02-23T05:33:13Z Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.637032 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.639263 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="62b986f679483da9ce4281abd51942798908d1dd6876ca39c947d22c8cd8458c" exitCode=255 Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.639350 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"62b986f679483da9ce4281abd51942798908d1dd6876ca39c947d22c8cd8458c"} Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.639834 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.641132 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.641179 4764 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.641199 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.642001 4764 scope.go:117] "RemoveContainer" containerID="62b986f679483da9ce4281abd51942798908d1dd6876ca39c947d22c8cd8458c" Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.205476 4764 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 09 13:21:00 crc kubenswrapper[4764]: [+]log ok Mar 09 13:21:00 crc kubenswrapper[4764]: [+]etcd ok Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/generic-apiserver-start-informers ok Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/priority-and-fairness-filter ok Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/start-apiextensions-informers ok Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/start-apiextensions-controllers 
ok Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/crd-informer-synced ok Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/start-system-namespaces-controller ok Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 09 13:21:00 crc kubenswrapper[4764]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 09 13:21:00 crc kubenswrapper[4764]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/bootstrap-controller ok Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/start-kube-aggregator-informers ok Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/apiservice-registration-controller ok Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/apiservice-discovery-controller ok Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 09 13:21:00 crc kubenswrapper[4764]: [+]autoregister-completion ok Mar 09 13:21:00 crc 
kubenswrapper[4764]: [+]poststarthook/apiservice-openapi-controller ok Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 09 13:21:00 crc kubenswrapper[4764]: livez check failed Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.206054 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.270013 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.270434 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.272295 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.272347 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.272366 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.306150 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.497104 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:00Z is after 2026-02-23T05:33:13Z Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 
13:21:00.644134 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.644844 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.646308 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="328b7a569c478d1b319039ad9297eb410278ee0ad8035b6d6a7e45dfc164092d" exitCode=255 Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.646387 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"328b7a569c478d1b319039ad9297eb410278ee0ad8035b6d6a7e45dfc164092d"} Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.646434 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.646452 4764 scope.go:117] "RemoveContainer" containerID="62b986f679483da9ce4281abd51942798908d1dd6876ca39c947d22c8cd8458c" Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.646636 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.647388 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.647450 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.647469 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.647975 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.648014 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.648026 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.648922 4764 scope.go:117] "RemoveContainer" containerID="328b7a569c478d1b319039ad9297eb410278ee0ad8035b6d6a7e45dfc164092d" Mar 09 13:21:00 crc kubenswrapper[4764]: E0309 13:21:00.649156 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.662335 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 09 13:21:01 crc kubenswrapper[4764]: I0309 13:21:01.498561 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:01Z is after 2026-02-23T05:33:13Z Mar 09 13:21:01 crc kubenswrapper[4764]: I0309 13:21:01.650558 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" 
Mar 09 13:21:01 crc kubenswrapper[4764]: I0309 13:21:01.653241 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:01 crc kubenswrapper[4764]: I0309 13:21:01.654063 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:01 crc kubenswrapper[4764]: I0309 13:21:01.654105 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:01 crc kubenswrapper[4764]: I0309 13:21:01.654117 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:02 crc kubenswrapper[4764]: I0309 13:21:02.126276 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:21:02 crc kubenswrapper[4764]: I0309 13:21:02.126464 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:02 crc kubenswrapper[4764]: I0309 13:21:02.127847 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:02 crc kubenswrapper[4764]: I0309 13:21:02.127916 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:02 crc kubenswrapper[4764]: I0309 13:21:02.127940 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:02 crc kubenswrapper[4764]: I0309 13:21:02.499296 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:02Z is after 2026-02-23T05:33:13Z Mar 09 13:21:02 crc 
kubenswrapper[4764]: I0309 13:21:02.918265 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:21:02 crc kubenswrapper[4764]: I0309 13:21:02.918504 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:02 crc kubenswrapper[4764]: I0309 13:21:02.919638 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:02 crc kubenswrapper[4764]: I0309 13:21:02.919695 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:02 crc kubenswrapper[4764]: I0309 13:21:02.919708 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:02 crc kubenswrapper[4764]: I0309 13:21:02.920278 4764 scope.go:117] "RemoveContainer" containerID="328b7a569c478d1b319039ad9297eb410278ee0ad8035b6d6a7e45dfc164092d" Mar 09 13:21:02 crc kubenswrapper[4764]: E0309 13:21:02.920455 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:21:03 crc kubenswrapper[4764]: W0309 13:21:03.250798 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:03Z is after 2026-02-23T05:33:13Z Mar 09 13:21:03 crc kubenswrapper[4764]: E0309 13:21:03.250867 4764 reflector.go:158] 
"Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:03Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:21:03 crc kubenswrapper[4764]: I0309 13:21:03.496957 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:03Z is after 2026-02-23T05:33:13Z Mar 09 13:21:03 crc kubenswrapper[4764]: W0309 13:21:03.867540 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:03Z is after 2026-02-23T05:33:13Z Mar 09 13:21:03 crc kubenswrapper[4764]: E0309 13:21:03.867626 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:03Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:21:04 crc kubenswrapper[4764]: W0309 13:21:04.211856 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:04Z is after 2026-02-23T05:33:13Z Mar 09 13:21:04 crc kubenswrapper[4764]: E0309 13:21:04.211924 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:04Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:21:04 crc kubenswrapper[4764]: I0309 13:21:04.499489 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:04Z is after 2026-02-23T05:33:13Z Mar 09 13:21:04 crc kubenswrapper[4764]: W0309 13:21:04.929352 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:04Z is after 2026-02-23T05:33:13Z Mar 09 13:21:04 crc kubenswrapper[4764]: E0309 13:21:04.930817 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:04Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:21:05 crc 
kubenswrapper[4764]: I0309 13:21:05.204985 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.205578 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.207002 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.207035 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.207044 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.207504 4764 scope.go:117] "RemoveContainer" containerID="328b7a569c478d1b319039ad9297eb410278ee0ad8035b6d6a7e45dfc164092d" Mar 09 13:21:05 crc kubenswrapper[4764]: E0309 13:21:05.207710 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.209960 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.499387 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is 
not yet valid: current time 2026-03-09T13:21:05Z is after 2026-02-23T05:33:13Z Mar 09 13:21:05 crc kubenswrapper[4764]: E0309 13:21:05.620510 4764 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.663253 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.664227 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.664260 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.664271 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.664931 4764 scope.go:117] "RemoveContainer" containerID="328b7a569c478d1b319039ad9297eb410278ee0ad8035b6d6a7e45dfc164092d" Mar 09 13:21:05 crc kubenswrapper[4764]: E0309 13:21:05.665155 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.805825 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:05 crc kubenswrapper[4764]: E0309 13:21:05.806912 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:05Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.808061 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.808137 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.808159 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.808194 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:21:05 crc kubenswrapper[4764]: E0309 13:21:05.813510 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:05Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 13:21:06 crc kubenswrapper[4764]: I0309 13:21:06.090678 4764 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 13:21:06 crc kubenswrapper[4764]: I0309 13:21:06.090755 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" 
probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 13:21:06 crc kubenswrapper[4764]: I0309 13:21:06.500018 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:06Z is after 2026-02-23T05:33:13Z Mar 09 13:21:07 crc kubenswrapper[4764]: I0309 13:21:07.496587 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:07Z is after 2026-02-23T05:33:13Z Mar 09 13:21:07 crc kubenswrapper[4764]: I0309 13:21:07.913841 4764 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 13:21:07 crc kubenswrapper[4764]: E0309 13:21:07.917877 4764 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:07Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:21:08 crc kubenswrapper[4764]: I0309 13:21:08.468194 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:21:08 crc kubenswrapper[4764]: I0309 13:21:08.468501 4764 kubelet_node_status.go:401] "Setting node annotation 
to enable volume controller attach/detach" Mar 09 13:21:08 crc kubenswrapper[4764]: I0309 13:21:08.469866 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:08 crc kubenswrapper[4764]: I0309 13:21:08.469900 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:08 crc kubenswrapper[4764]: I0309 13:21:08.469940 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:08 crc kubenswrapper[4764]: I0309 13:21:08.470508 4764 scope.go:117] "RemoveContainer" containerID="328b7a569c478d1b319039ad9297eb410278ee0ad8035b6d6a7e45dfc164092d" Mar 09 13:21:08 crc kubenswrapper[4764]: E0309 13:21:08.470687 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:21:08 crc kubenswrapper[4764]: I0309 13:21:08.496948 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:08Z is after 2026-02-23T05:33:13Z Mar 09 13:21:09 crc kubenswrapper[4764]: E0309 13:21:09.404487 4764 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:09Z is after 2026-02-23T05:33:13Z" 
event="&Event{ObjectMeta:{crc.189b2ee66c1ac777 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.491472247 +0000 UTC m=+0.741644175,LastTimestamp:2026-03-09 13:20:45.491472247 +0000 UTC m=+0.741644175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:09 crc kubenswrapper[4764]: I0309 13:21:09.497735 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:09Z is after 2026-02-23T05:33:13Z Mar 09 13:21:10 crc kubenswrapper[4764]: I0309 13:21:10.497804 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:10Z is after 2026-02-23T05:33:13Z Mar 09 13:21:10 crc kubenswrapper[4764]: W0309 13:21:10.677170 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:10Z is after 2026-02-23T05:33:13Z Mar 09 13:21:10 crc kubenswrapper[4764]: E0309 13:21:10.677231 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to 
list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:10Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:21:11 crc kubenswrapper[4764]: I0309 13:21:11.498261 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:11Z is after 2026-02-23T05:33:13Z Mar 09 13:21:12 crc kubenswrapper[4764]: I0309 13:21:12.496844 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:12Z is after 2026-02-23T05:33:13Z Mar 09 13:21:12 crc kubenswrapper[4764]: E0309 13:21:12.813321 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:12Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 09 13:21:12 crc kubenswrapper[4764]: I0309 13:21:12.814363 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:12 crc kubenswrapper[4764]: I0309 13:21:12.816299 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:12 crc kubenswrapper[4764]: I0309 13:21:12.816339 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:21:12 crc kubenswrapper[4764]: I0309 13:21:12.816354 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:12 crc kubenswrapper[4764]: I0309 13:21:12.816382 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:21:12 crc kubenswrapper[4764]: E0309 13:21:12.821306 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:12Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 13:21:12 crc kubenswrapper[4764]: W0309 13:21:12.967400 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:12Z is after 2026-02-23T05:33:13Z Mar 09 13:21:12 crc kubenswrapper[4764]: E0309 13:21:12.967710 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:12Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:21:13 crc kubenswrapper[4764]: I0309 13:21:13.497839 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:13Z is after 
2026-02-23T05:33:13Z Mar 09 13:21:14 crc kubenswrapper[4764]: I0309 13:21:14.497472 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:14Z is after 2026-02-23T05:33:13Z Mar 09 13:21:14 crc kubenswrapper[4764]: W0309 13:21:14.754804 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:14Z is after 2026-02-23T05:33:13Z Mar 09 13:21:14 crc kubenswrapper[4764]: E0309 13:21:14.754948 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:14Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:21:15 crc kubenswrapper[4764]: I0309 13:21:15.497174 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:15Z is after 2026-02-23T05:33:13Z Mar 09 13:21:15 crc kubenswrapper[4764]: E0309 13:21:15.620731 4764 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 
13:21:16.090057 4764 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.090218 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.090431 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.090733 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.092394 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.092445 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.092460 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.093080 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"fb990e0575b0c9fc6674c0828a3e6e8ebaba534f1375aae50660850aefb97d69"} 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.093286 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://fb990e0575b0c9fc6674c0828a3e6e8ebaba534f1375aae50660850aefb97d69" gracePeriod=30 Mar 09 13:21:16 crc kubenswrapper[4764]: W0309 13:21:16.153124 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:16Z is after 2026-02-23T05:33:13Z Mar 09 13:21:16 crc kubenswrapper[4764]: E0309 13:21:16.153291 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:16Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.497024 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:16Z is after 2026-02-23T05:33:13Z Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.691115 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.691419 4764 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="fb990e0575b0c9fc6674c0828a3e6e8ebaba534f1375aae50660850aefb97d69" exitCode=255 Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.691761 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"fb990e0575b0c9fc6674c0828a3e6e8ebaba534f1375aae50660850aefb97d69"} Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.691794 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"698e8e53f96938b60a1aefe2ccea945879623e79d8b0f8836a67cb38ec85918f"} Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.691893 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.692794 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.692818 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.692829 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:17 crc kubenswrapper[4764]: I0309 13:21:17.497245 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:17Z is after 2026-02-23T05:33:13Z Mar 09 13:21:18 crc kubenswrapper[4764]: I0309 13:21:18.498290 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:18Z is after 2026-02-23T05:33:13Z Mar 09 13:21:19 crc kubenswrapper[4764]: E0309 13:21:19.409829 4764 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:19Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b2ee66c1ac777 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.491472247 +0000 UTC m=+0.741644175,LastTimestamp:2026-03-09 13:20:45.491472247 +0000 UTC m=+0.741644175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:19 crc kubenswrapper[4764]: I0309 13:21:19.498786 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:19Z is after 2026-02-23T05:33:13Z Mar 09 13:21:19 crc kubenswrapper[4764]: I0309 13:21:19.559165 4764 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:19 crc kubenswrapper[4764]: I0309 13:21:19.560696 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:19 crc kubenswrapper[4764]: I0309 13:21:19.560736 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:19 crc kubenswrapper[4764]: I0309 13:21:19.560745 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:19 crc kubenswrapper[4764]: I0309 13:21:19.561266 4764 scope.go:117] "RemoveContainer" containerID="328b7a569c478d1b319039ad9297eb410278ee0ad8035b6d6a7e45dfc164092d" Mar 09 13:21:19 crc kubenswrapper[4764]: E0309 13:21:19.816624 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:19Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 09 13:21:19 crc kubenswrapper[4764]: I0309 13:21:19.821836 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:19 crc kubenswrapper[4764]: I0309 13:21:19.822997 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:19 crc kubenswrapper[4764]: I0309 13:21:19.823025 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:19 crc kubenswrapper[4764]: I0309 13:21:19.823034 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:19 crc kubenswrapper[4764]: I0309 13:21:19.823054 4764 kubelet_node_status.go:76] "Attempting to register node" 
node="crc" Mar 09 13:21:19 crc kubenswrapper[4764]: E0309 13:21:19.825417 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:19Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 13:21:20 crc kubenswrapper[4764]: I0309 13:21:20.496813 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:20Z is after 2026-02-23T05:33:13Z Mar 09 13:21:20 crc kubenswrapper[4764]: I0309 13:21:20.702921 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 09 13:21:20 crc kubenswrapper[4764]: I0309 13:21:20.704098 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9e2c9fa564beaaf635042afaf0505ac0356f3e639d3ce86340bd1e2df8ced0a3"} Mar 09 13:21:20 crc kubenswrapper[4764]: I0309 13:21:20.704232 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:20 crc kubenswrapper[4764]: I0309 13:21:20.705090 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:20 crc kubenswrapper[4764]: I0309 13:21:20.705136 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:20 crc kubenswrapper[4764]: I0309 13:21:20.705151 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 09 13:21:20 crc kubenswrapper[4764]: I0309 13:21:20.765831 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:21:20 crc kubenswrapper[4764]: I0309 13:21:20.766047 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:20 crc kubenswrapper[4764]: I0309 13:21:20.767423 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:20 crc kubenswrapper[4764]: I0309 13:21:20.767463 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:20 crc kubenswrapper[4764]: I0309 13:21:20.767471 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:21 crc kubenswrapper[4764]: I0309 13:21:21.497823 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:21Z is after 2026-02-23T05:33:13Z Mar 09 13:21:21 crc kubenswrapper[4764]: I0309 13:21:21.709563 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 09 13:21:21 crc kubenswrapper[4764]: I0309 13:21:21.710299 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 09 13:21:21 crc kubenswrapper[4764]: I0309 13:21:21.711949 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="9e2c9fa564beaaf635042afaf0505ac0356f3e639d3ce86340bd1e2df8ced0a3" exitCode=255 Mar 09 13:21:21 crc kubenswrapper[4764]: I0309 13:21:21.711983 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9e2c9fa564beaaf635042afaf0505ac0356f3e639d3ce86340bd1e2df8ced0a3"} Mar 09 13:21:21 crc kubenswrapper[4764]: I0309 13:21:21.712017 4764 scope.go:117] "RemoveContainer" containerID="328b7a569c478d1b319039ad9297eb410278ee0ad8035b6d6a7e45dfc164092d" Mar 09 13:21:21 crc kubenswrapper[4764]: I0309 13:21:21.712147 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:21 crc kubenswrapper[4764]: I0309 13:21:21.713370 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:21 crc kubenswrapper[4764]: I0309 13:21:21.713400 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:21 crc kubenswrapper[4764]: I0309 13:21:21.713409 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:21 crc kubenswrapper[4764]: I0309 13:21:21.713906 4764 scope.go:117] "RemoveContainer" containerID="9e2c9fa564beaaf635042afaf0505ac0356f3e639d3ce86340bd1e2df8ced0a3" Mar 09 13:21:21 crc kubenswrapper[4764]: E0309 13:21:21.714067 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:21:22 crc kubenswrapper[4764]: I0309 13:21:22.496725 4764 csi_plugin.go:884] 
Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:22Z is after 2026-02-23T05:33:13Z Mar 09 13:21:22 crc kubenswrapper[4764]: I0309 13:21:22.715421 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 09 13:21:22 crc kubenswrapper[4764]: I0309 13:21:22.918823 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:21:22 crc kubenswrapper[4764]: I0309 13:21:22.919128 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:22 crc kubenswrapper[4764]: I0309 13:21:22.920771 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:22 crc kubenswrapper[4764]: I0309 13:21:22.920841 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:22 crc kubenswrapper[4764]: I0309 13:21:22.920854 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:22 crc kubenswrapper[4764]: I0309 13:21:22.921615 4764 scope.go:117] "RemoveContainer" containerID="9e2c9fa564beaaf635042afaf0505ac0356f3e639d3ce86340bd1e2df8ced0a3" Mar 09 13:21:22 crc kubenswrapper[4764]: E0309 13:21:22.921877 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:21:23 crc kubenswrapper[4764]: I0309 13:21:23.089624 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:21:23 crc kubenswrapper[4764]: I0309 13:21:23.089838 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:23 crc kubenswrapper[4764]: I0309 13:21:23.091172 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:23 crc kubenswrapper[4764]: I0309 13:21:23.091227 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:23 crc kubenswrapper[4764]: I0309 13:21:23.091241 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:23 crc kubenswrapper[4764]: I0309 13:21:23.496702 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:23Z is after 2026-02-23T05:33:13Z Mar 09 13:21:24 crc kubenswrapper[4764]: I0309 13:21:24.497415 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:24Z is after 2026-02-23T05:33:13Z Mar 09 13:21:24 crc kubenswrapper[4764]: I0309 13:21:24.833345 4764 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 13:21:24 crc kubenswrapper[4764]: E0309 13:21:24.838089 
4764 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:24Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:21:24 crc kubenswrapper[4764]: E0309 13:21:24.839394 4764 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 09 13:21:25 crc kubenswrapper[4764]: I0309 13:21:25.497076 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:25Z is after 2026-02-23T05:33:13Z Mar 09 13:21:25 crc kubenswrapper[4764]: E0309 13:21:25.621012 4764 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 13:21:26 crc kubenswrapper[4764]: I0309 13:21:26.090111 4764 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 13:21:26 crc kubenswrapper[4764]: I0309 13:21:26.090195 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 13:21:26 crc kubenswrapper[4764]: I0309 13:21:26.496889 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:26Z is after 2026-02-23T05:33:13Z Mar 09 13:21:26 crc kubenswrapper[4764]: E0309 13:21:26.822718 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:26Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 09 13:21:26 crc kubenswrapper[4764]: I0309 13:21:26.825840 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:26 crc kubenswrapper[4764]: I0309 13:21:26.827603 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:26 crc kubenswrapper[4764]: I0309 13:21:26.827751 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:26 crc kubenswrapper[4764]: I0309 13:21:26.827772 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:26 crc kubenswrapper[4764]: I0309 13:21:26.827822 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:21:26 crc kubenswrapper[4764]: E0309 13:21:26.831058 4764 kubelet_node_status.go:99] 
"Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:26Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 13:21:27 crc kubenswrapper[4764]: I0309 13:21:27.497748 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:27Z is after 2026-02-23T05:33:13Z Mar 09 13:21:28 crc kubenswrapper[4764]: I0309 13:21:28.468066 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:21:28 crc kubenswrapper[4764]: I0309 13:21:28.468268 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:28 crc kubenswrapper[4764]: I0309 13:21:28.469524 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:28 crc kubenswrapper[4764]: I0309 13:21:28.469588 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:28 crc kubenswrapper[4764]: I0309 13:21:28.469607 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:28 crc kubenswrapper[4764]: I0309 13:21:28.470527 4764 scope.go:117] "RemoveContainer" containerID="9e2c9fa564beaaf635042afaf0505ac0356f3e639d3ce86340bd1e2df8ced0a3" Mar 09 13:21:28 crc kubenswrapper[4764]: E0309 13:21:28.470849 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:21:28 crc kubenswrapper[4764]: I0309 13:21:28.498918 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:28Z is after 2026-02-23T05:33:13Z Mar 09 13:21:28 crc kubenswrapper[4764]: W0309 13:21:28.513098 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:28Z is after 2026-02-23T05:33:13Z Mar 09 13:21:28 crc kubenswrapper[4764]: E0309 13:21:28.513173 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:28Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:21:29 crc kubenswrapper[4764]: E0309 13:21:29.413698 4764 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:29Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b2ee66c1ac777 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.491472247 +0000 UTC m=+0.741644175,LastTimestamp:2026-03-09 13:20:45.491472247 +0000 UTC m=+0.741644175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:29 crc kubenswrapper[4764]: I0309 13:21:29.499973 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:29Z is after 2026-02-23T05:33:13Z Mar 09 13:21:29 crc kubenswrapper[4764]: W0309 13:21:29.679123 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:29Z is after 2026-02-23T05:33:13Z Mar 09 13:21:29 crc kubenswrapper[4764]: E0309 13:21:29.679249 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:21:30 crc kubenswrapper[4764]: I0309 13:21:30.497405 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:30Z is after 2026-02-23T05:33:13Z Mar 09 13:21:31 crc kubenswrapper[4764]: I0309 13:21:31.496673 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:31Z is after 2026-02-23T05:33:13Z Mar 09 13:21:32 crc kubenswrapper[4764]: I0309 13:21:32.497062 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:32Z is after 2026-02-23T05:33:13Z Mar 09 13:21:33 crc kubenswrapper[4764]: I0309 13:21:33.118692 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 13:21:33 crc kubenswrapper[4764]: I0309 13:21:33.118863 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:33 crc kubenswrapper[4764]: I0309 13:21:33.119987 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:33 crc kubenswrapper[4764]: I0309 13:21:33.120028 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:33 crc kubenswrapper[4764]: I0309 13:21:33.120038 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:33 crc kubenswrapper[4764]: I0309 13:21:33.497696 4764 csi_plugin.go:884] 
Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:33Z is after 2026-02-23T05:33:13Z Mar 09 13:21:33 crc kubenswrapper[4764]: E0309 13:21:33.828292 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:33Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 09 13:21:33 crc kubenswrapper[4764]: I0309 13:21:33.831315 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:33 crc kubenswrapper[4764]: I0309 13:21:33.832466 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:33 crc kubenswrapper[4764]: I0309 13:21:33.832498 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:33 crc kubenswrapper[4764]: I0309 13:21:33.832564 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:33 crc kubenswrapper[4764]: I0309 13:21:33.832586 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:21:33 crc kubenswrapper[4764]: E0309 13:21:33.836398 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:33Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 13:21:34 crc kubenswrapper[4764]: W0309 13:21:34.391418 4764 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:34Z is after 2026-02-23T05:33:13Z Mar 09 13:21:34 crc kubenswrapper[4764]: E0309 13:21:34.391521 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:34Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:21:34 crc kubenswrapper[4764]: I0309 13:21:34.497083 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:34Z is after 2026-02-23T05:33:13Z Mar 09 13:21:35 crc kubenswrapper[4764]: I0309 13:21:35.497123 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:35Z is after 2026-02-23T05:33:13Z Mar 09 13:21:35 crc kubenswrapper[4764]: E0309 13:21:35.621760 4764 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 13:21:36 crc kubenswrapper[4764]: I0309 13:21:36.090289 4764 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller 
namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 13:21:36 crc kubenswrapper[4764]: I0309 13:21:36.090410 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 13:21:36 crc kubenswrapper[4764]: I0309 13:21:36.500311 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:36Z is after 2026-02-23T05:33:13Z Mar 09 13:21:37 crc kubenswrapper[4764]: I0309 13:21:37.497053 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:37Z is after 2026-02-23T05:33:13Z Mar 09 13:21:37 crc kubenswrapper[4764]: W0309 13:21:37.556076 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:37Z is after 2026-02-23T05:33:13Z Mar 09 13:21:37 crc kubenswrapper[4764]: E0309 13:21:37.556141 4764 reflector.go:158] 
"Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:21:38 crc kubenswrapper[4764]: I0309 13:21:38.496435 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:38Z is after 2026-02-23T05:33:13Z Mar 09 13:21:39 crc kubenswrapper[4764]: E0309 13:21:39.420049 4764 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:39Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b2ee66c1ac777 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.491472247 +0000 UTC m=+0.741644175,LastTimestamp:2026-03-09 13:20:45.491472247 +0000 UTC m=+0.741644175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:39 crc kubenswrapper[4764]: I0309 13:21:39.497080 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:39Z is after 2026-02-23T05:33:13Z Mar 09 13:21:40 crc kubenswrapper[4764]: I0309 13:21:40.502309 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:40 crc kubenswrapper[4764]: I0309 13:21:40.836993 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:40 crc kubenswrapper[4764]: I0309 13:21:40.838971 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:40 crc kubenswrapper[4764]: I0309 13:21:40.839049 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:40 crc kubenswrapper[4764]: I0309 13:21:40.839079 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:40 crc kubenswrapper[4764]: I0309 13:21:40.839133 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:21:40 crc kubenswrapper[4764]: E0309 13:21:40.839723 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 09 13:21:40 crc kubenswrapper[4764]: E0309 13:21:40.846128 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 09 13:21:41 crc 
kubenswrapper[4764]: I0309 13:21:41.498100 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:42 crc kubenswrapper[4764]: I0309 13:21:42.499298 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:43 crc kubenswrapper[4764]: I0309 13:21:43.497777 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:43 crc kubenswrapper[4764]: I0309 13:21:43.559463 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:43 crc kubenswrapper[4764]: I0309 13:21:43.560441 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:43 crc kubenswrapper[4764]: I0309 13:21:43.560471 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:43 crc kubenswrapper[4764]: I0309 13:21:43.560481 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:43 crc kubenswrapper[4764]: I0309 13:21:43.561699 4764 scope.go:117] "RemoveContainer" containerID="9e2c9fa564beaaf635042afaf0505ac0356f3e639d3ce86340bd1e2df8ced0a3" Mar 09 13:21:43 crc kubenswrapper[4764]: I0309 13:21:43.772179 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 09 
13:21:43 crc kubenswrapper[4764]: I0309 13:21:43.774309 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356"} Mar 09 13:21:43 crc kubenswrapper[4764]: I0309 13:21:43.774436 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:43 crc kubenswrapper[4764]: I0309 13:21:43.775248 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:43 crc kubenswrapper[4764]: I0309 13:21:43.775291 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:43 crc kubenswrapper[4764]: I0309 13:21:43.775304 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:44 crc kubenswrapper[4764]: I0309 13:21:44.500349 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:44 crc kubenswrapper[4764]: I0309 13:21:44.777931 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 13:21:44 crc kubenswrapper[4764]: I0309 13:21:44.779228 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 09 13:21:44 crc kubenswrapper[4764]: I0309 13:21:44.781458 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356" exitCode=255 Mar 09 13:21:44 crc kubenswrapper[4764]: I0309 13:21:44.781497 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356"} Mar 09 13:21:44 crc kubenswrapper[4764]: I0309 13:21:44.781555 4764 scope.go:117] "RemoveContainer" containerID="9e2c9fa564beaaf635042afaf0505ac0356f3e639d3ce86340bd1e2df8ced0a3" Mar 09 13:21:44 crc kubenswrapper[4764]: I0309 13:21:44.781721 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:44 crc kubenswrapper[4764]: I0309 13:21:44.782628 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:44 crc kubenswrapper[4764]: I0309 13:21:44.782698 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:44 crc kubenswrapper[4764]: I0309 13:21:44.782708 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:44 crc kubenswrapper[4764]: I0309 13:21:44.783665 4764 scope.go:117] "RemoveContainer" containerID="033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356" Mar 09 13:21:44 crc kubenswrapper[4764]: E0309 13:21:44.783889 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:21:45 crc kubenswrapper[4764]: I0309 13:21:45.500197 4764 csi_plugin.go:884] 
Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:45 crc kubenswrapper[4764]: E0309 13:21:45.621981 4764 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 13:21:45 crc kubenswrapper[4764]: I0309 13:21:45.787554 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.089985 4764 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.090440 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.090751 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.091197 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.093437 4764 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.093477 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.093487 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.094051 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"698e8e53f96938b60a1aefe2ccea945879623e79d8b0f8836a67cb38ec85918f"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.094173 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://698e8e53f96938b60a1aefe2ccea945879623e79d8b0f8836a67cb38ec85918f" gracePeriod=30 Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.497493 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.797279 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.798361 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.798680 4764 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="698e8e53f96938b60a1aefe2ccea945879623e79d8b0f8836a67cb38ec85918f" exitCode=255 Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.798719 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"698e8e53f96938b60a1aefe2ccea945879623e79d8b0f8836a67cb38ec85918f"} Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.798759 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"98efc0508ec5c94130c0707470eed2e0ae1117e57a8139af476e03a4bc67e788"} Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.798779 4764 scope.go:117] "RemoveContainer" containerID="fb990e0575b0c9fc6674c0828a3e6e8ebaba534f1375aae50660850aefb97d69" Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.798886 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.799807 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.799838 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.799849 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:47 crc kubenswrapper[4764]: I0309 13:21:47.498897 4764 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:47 crc kubenswrapper[4764]: I0309 13:21:47.803802 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 09 13:21:47 crc kubenswrapper[4764]: I0309 13:21:47.846441 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:47 crc kubenswrapper[4764]: E0309 13:21:47.847074 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 09 13:21:47 crc kubenswrapper[4764]: I0309 13:21:47.848446 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:47 crc kubenswrapper[4764]: I0309 13:21:47.848491 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:47 crc kubenswrapper[4764]: I0309 13:21:47.848504 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:47 crc kubenswrapper[4764]: I0309 13:21:47.848536 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:21:47 crc kubenswrapper[4764]: E0309 13:21:47.855240 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 09 13:21:48 crc kubenswrapper[4764]: I0309 13:21:48.467906 4764 kubelet.go:2542] 
"SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:21:48 crc kubenswrapper[4764]: I0309 13:21:48.468163 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:48 crc kubenswrapper[4764]: I0309 13:21:48.469804 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:48 crc kubenswrapper[4764]: I0309 13:21:48.469847 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:48 crc kubenswrapper[4764]: I0309 13:21:48.469864 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:48 crc kubenswrapper[4764]: I0309 13:21:48.470582 4764 scope.go:117] "RemoveContainer" containerID="033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356" Mar 09 13:21:48 crc kubenswrapper[4764]: E0309 13:21:48.470824 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:21:48 crc kubenswrapper[4764]: I0309 13:21:48.500216 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.426018 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{crc.189b2ee66c1ac777 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.491472247 +0000 UTC m=+0.741644175,LastTimestamp:2026-03-09 13:20:45.491472247 +0000 UTC m=+0.741644175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.429845 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e90ae28 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532753448 +0000 UTC m=+0.782925356,LastTimestamp:2026-03-09 13:20:45.532753448 +0000 UTC m=+0.782925356,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.434856 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e90e367 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532767079 +0000 UTC m=+0.782938987,LastTimestamp:2026-03-09 13:20:45.532767079 +0000 UTC m=+0.782938987,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.439063 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e910e0f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532777999 +0000 UTC m=+0.782949907,LastTimestamp:2026-03-09 13:20:45.532777999 +0000 UTC m=+0.782949907,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.443318 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee6739ccec1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across 
pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.617434305 +0000 UTC m=+0.867606213,LastTimestamp:2026-03-09 13:20:45.617434305 +0000 UTC m=+0.867606213,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.447625 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e90ae28\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e90ae28 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532753448 +0000 UTC m=+0.782925356,LastTimestamp:2026-03-09 13:20:45.660763503 +0000 UTC m=+0.910935411,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.451248 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e90e367\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e90e367 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532767079 +0000 UTC m=+0.782938987,LastTimestamp:2026-03-09 
13:20:45.660789644 +0000 UTC m=+0.910961552,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.455530 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e910e0f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e910e0f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532777999 +0000 UTC m=+0.782949907,LastTimestamp:2026-03-09 13:20:45.660799444 +0000 UTC m=+0.910971352,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.459188 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e90ae28\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e90ae28 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532753448 +0000 UTC m=+0.782925356,LastTimestamp:2026-03-09 13:20:45.662210973 +0000 UTC m=+0.912382881,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.461709 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e90e367\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e90e367 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532767079 +0000 UTC m=+0.782938987,LastTimestamp:2026-03-09 13:20:45.662228033 +0000 UTC m=+0.912399941,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.463307 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e910e0f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e910e0f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532777999 +0000 UTC m=+0.782949907,LastTimestamp:2026-03-09 13:20:45.662237653 +0000 UTC m=+0.912409561,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.465550 4764 event.go:359] 
"Server rejected event (will not retry!)" err="events \"crc.189b2ee66e90ae28\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e90ae28 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532753448 +0000 UTC m=+0.782925356,LastTimestamp:2026-03-09 13:20:45.662904587 +0000 UTC m=+0.913076495,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.467717 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e90e367\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e90e367 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532767079 +0000 UTC m=+0.782938987,LastTimestamp:2026-03-09 13:20:45.662920297 +0000 UTC m=+0.913092195,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.472163 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e910e0f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API 
group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e910e0f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532777999 +0000 UTC m=+0.782949907,LastTimestamp:2026-03-09 13:20:45.662928328 +0000 UTC m=+0.913100236,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.477270 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e90ae28\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e90ae28 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532753448 +0000 UTC m=+0.782925356,LastTimestamp:2026-03-09 13:20:45.66303954 +0000 UTC m=+0.913211448,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.485833 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e90e367\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e90e367 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532767079 +0000 UTC m=+0.782938987,LastTimestamp:2026-03-09 13:20:45.66304797 +0000 UTC m=+0.913219878,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.492777 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e910e0f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e910e0f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532777999 +0000 UTC m=+0.782949907,LastTimestamp:2026-03-09 13:20:45.66305535 +0000 UTC m=+0.913227258,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: I0309 13:21:49.497553 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.497894 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e90ae28\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in 
the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e90ae28 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532753448 +0000 UTC m=+0.782925356,LastTimestamp:2026-03-09 13:20:45.663195463 +0000 UTC m=+0.913367371,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.511343 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e90e367\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e90e367 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532767079 +0000 UTC m=+0.782938987,LastTimestamp:2026-03-09 13:20:45.663206333 +0000 UTC m=+0.913378241,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.517253 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e910e0f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e910e0f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532777999 +0000 UTC m=+0.782949907,LastTimestamp:2026-03-09 13:20:45.663230864 +0000 UTC m=+0.913402772,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.521909 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e90ae28\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e90ae28 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532753448 +0000 UTC m=+0.782925356,LastTimestamp:2026-03-09 13:20:45.664022 +0000 UTC m=+0.914193908,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.525905 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e90e367\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e90e367 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status 
is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532767079 +0000 UTC m=+0.782938987,LastTimestamp:2026-03-09 13:20:45.664058001 +0000 UTC m=+0.914229899,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.530127 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e910e0f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e910e0f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532777999 +0000 UTC m=+0.782949907,LastTimestamp:2026-03-09 13:20:45.664065361 +0000 UTC m=+0.914237269,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.534960 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e90ae28\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e90ae28 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532753448 +0000 UTC 
m=+0.782925356,LastTimestamp:2026-03-09 13:20:45.66499769 +0000 UTC m=+0.915169598,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.540921 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e90e367\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e90e367 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532767079 +0000 UTC m=+0.782938987,LastTimestamp:2026-03-09 13:20:45.66500745 +0000 UTC m=+0.915179358,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.546040 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee68d8d4ac9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.052625097 +0000 UTC m=+1.302797005,LastTimestamp:2026-03-09 13:20:46.052625097 +0000 UTC m=+1.302797005,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.550135 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee68d8de552 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.052664658 +0000 UTC m=+1.302836566,LastTimestamp:2026-03-09 13:20:46.052664658 +0000 UTC m=+1.302836566,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.554040 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b2ee68d8d7bf7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.052637687 +0000 UTC m=+1.302809635,LastTimestamp:2026-03-09 13:20:46.052637687 +0000 UTC m=+1.302809635,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.557572 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee68e94934f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.069879631 +0000 UTC m=+1.320051549,LastTimestamp:2026-03-09 13:20:46.069879631 +0000 UTC m=+1.320051549,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 
13:21:49.568909 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2ee68ed8357d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.074312061 +0000 UTC m=+1.324483990,LastTimestamp:2026-03-09 13:20:46.074312061 +0000 UTC m=+1.324483990,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.572400 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b2ee6b427710e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.700261646 +0000 UTC m=+1.950433554,LastTimestamp:2026-03-09 13:20:46.700261646 
+0000 UTC m=+1.950433554,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.575894 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee6b429075a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.700365658 +0000 UTC m=+1.950537566,LastTimestamp:2026-03-09 13:20:46.700365658 +0000 UTC m=+1.950537566,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.580240 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2ee6b42a5c6a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container 
wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.70045297 +0000 UTC m=+1.950624878,LastTimestamp:2026-03-09 13:20:46.70045297 +0000 UTC m=+1.950624878,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.586499 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee6b42ade38 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.7004862 +0000 UTC m=+1.950658108,LastTimestamp:2026-03-09 13:20:46.7004862 +0000 UTC m=+1.950658108,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.590918 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee6b42b40e5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container 
setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.700511461 +0000 UTC m=+1.950683369,LastTimestamp:2026-03-09 13:20:46.700511461 +0000 UTC m=+1.950683369,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.594847 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2ee6b578e8bf openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.722377919 +0000 UTC m=+1.972549827,LastTimestamp:2026-03-09 13:20:46.722377919 +0000 UTC m=+1.972549827,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.600233 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee6b5b56cfb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.726343931 +0000 UTC m=+1.976515839,LastTimestamp:2026-03-09 13:20:46.726343931 +0000 UTC m=+1.976515839,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.606316 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee6b5b6d588 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.726436232 +0000 UTC m=+1.976608140,LastTimestamp:2026-03-09 13:20:46.726436232 +0000 UTC m=+1.976608140,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.609913 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee6b5b75e4b openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.726471243 +0000 UTC m=+1.976643151,LastTimestamp:2026-03-09 13:20:46.726471243 +0000 UTC m=+1.976643151,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.613499 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b2ee6b5bce909 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.726834441 +0000 UTC m=+1.977006339,LastTimestamp:2026-03-09 13:20:46.726834441 +0000 UTC m=+1.977006339,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.617938 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee6b5cb3b48 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.727773 +0000 UTC m=+1.977944908,LastTimestamp:2026-03-09 13:20:46.727773 +0000 UTC m=+1.977944908,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.621242 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee6ca74dc64 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.074434148 +0000 UTC m=+2.324606056,LastTimestamp:2026-03-09 13:20:47.074434148 +0000 UTC m=+2.324606056,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 
13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.624592 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee6cb076563 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.084037475 +0000 UTC m=+2.334209383,LastTimestamp:2026-03-09 13:20:47.084037475 +0000 UTC m=+2.334209383,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.627802 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee6cb16efb6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.085055926 +0000 UTC m=+2.335227834,LastTimestamp:2026-03-09 13:20:47.085055926 +0000 UTC m=+2.335227834,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.633996 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee6d595366d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.261103725 +0000 UTC m=+2.511275653,LastTimestamp:2026-03-09 13:20:47.261103725 +0000 UTC m=+2.511275653,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.637559 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee6d647f4f2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.272817906 +0000 UTC m=+2.522989824,LastTimestamp:2026-03-09 13:20:47.272817906 +0000 UTC m=+2.522989824,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.641018 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee6d65bd7ca openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.274121162 +0000 UTC m=+2.524293080,LastTimestamp:2026-03-09 13:20:47.274121162 +0000 UTC m=+2.524293080,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.644316 4764 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee6e07a4c29 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.443889193 +0000 UTC m=+2.694061101,LastTimestamp:2026-03-09 13:20:47.443889193 +0000 UTC m=+2.694061101,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.648503 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee6e1b42eb3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.464459955 +0000 UTC m=+2.714631873,LastTimestamp:2026-03-09 13:20:47.464459955 +0000 UTC 
m=+2.714631873,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.652387 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2ee6e8616152 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.576473938 +0000 UTC m=+2.826645846,LastTimestamp:2026-03-09 13:20:47.576473938 +0000 UTC m=+2.826645846,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.656298 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee6e8c58cd7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.583038679 +0000 UTC m=+2.833210587,LastTimestamp:2026-03-09 13:20:47.583038679 +0000 UTC m=+2.833210587,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.660018 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee6e9032c5a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.58707721 +0000 UTC m=+2.837249118,LastTimestamp:2026-03-09 13:20:47.58707721 +0000 UTC m=+2.837249118,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.663166 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b2ee6e90d5c7f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 
UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.587744895 +0000 UTC m=+2.837916813,LastTimestamp:2026-03-09 13:20:47.587744895 +0000 UTC m=+2.837916813,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.666386 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2ee6f446c9ce openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.776057806 +0000 UTC m=+3.026229704,LastTimestamp:2026-03-09 13:20:47.776057806 +0000 UTC m=+3.026229704,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.669621 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee6f47996d4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.779387092 +0000 UTC m=+3.029558990,LastTimestamp:2026-03-09 13:20:47.779387092 +0000 UTC m=+3.029558990,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.672735 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee6f47acc02 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.779466242 +0000 UTC m=+3.029638150,LastTimestamp:2026-03-09 13:20:47.779466242 +0000 UTC m=+3.029638150,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.677097 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b2ee6f47b9557 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.779517783 +0000 UTC m=+3.029689691,LastTimestamp:2026-03-09 13:20:47.779517783 +0000 UTC m=+3.029689691,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.680978 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2ee6f4e69bfd openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.786531837 +0000 UTC m=+3.036703745,LastTimestamp:2026-03-09 13:20:47.786531837 +0000 UTC m=+3.036703745,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.684889 4764 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2ee6f4fdb55a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.788045658 +0000 UTC m=+3.038217566,LastTimestamp:2026-03-09 13:20:47.788045658 +0000 UTC m=+3.038217566,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.689326 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee6f5a52033 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.799017523 +0000 UTC m=+3.049189431,LastTimestamp:2026-03-09 13:20:47.799017523 +0000 UTC 
m=+3.049189431,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.693618 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee6f5b44c7d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.800011901 +0000 UTC m=+3.050183809,LastTimestamp:2026-03-09 13:20:47.800011901 +0000 UTC m=+3.050183809,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.698605 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b2ee6f5fd87b4 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.804811188 +0000 UTC m=+3.054983096,LastTimestamp:2026-03-09 13:20:47.804811188 +0000 UTC m=+3.054983096,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.702334 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee6f6b79f2c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.817006892 +0000 UTC m=+3.067178800,LastTimestamp:2026-03-09 13:20:47.817006892 +0000 UTC m=+3.067178800,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.705770 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2ee6feabb9c8 openshift-kube-scheduler 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.950445 +0000 UTC m=+3.200616908,LastTimestamp:2026-03-09 13:20:47.950445 +0000 UTC m=+3.200616908,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.709690 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee6fec32ec0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.951982272 +0000 UTC m=+3.202154180,LastTimestamp:2026-03-09 13:20:47.951982272 +0000 UTC m=+3.202154180,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.713081 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2ee6ff576936 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.961696566 +0000 UTC m=+3.211868474,LastTimestamp:2026-03-09 13:20:47.961696566 +0000 UTC m=+3.211868474,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.716725 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2ee6ff6870b3 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.962812595 +0000 UTC m=+3.212984513,LastTimestamp:2026-03-09 13:20:47.962812595 +0000 UTC m=+3.212984513,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc 
kubenswrapper[4764]: E0309 13:21:49.721330 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee6ffbb9367 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.968260967 +0000 UTC m=+3.218432875,LastTimestamp:2026-03-09 13:20:47.968260967 +0000 UTC m=+3.218432875,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.725224 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee6ffcc72e4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.969366756 +0000 UTC 
m=+3.219538664,LastTimestamp:2026-03-09 13:20:47.969366756 +0000 UTC m=+3.219538664,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.729103 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2ee7098eab6d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:48.133090157 +0000 UTC m=+3.383262065,LastTimestamp:2026-03-09 13:20:48.133090157 +0000 UTC m=+3.383262065,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.733097 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee7099595f3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container 
kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:48.133543411 +0000 UTC m=+3.383715329,LastTimestamp:2026-03-09 13:20:48.133543411 +0000 UTC m=+3.383715329,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.736708 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2ee70a6bc347 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:48.147579719 +0000 UTC m=+3.397751637,LastTimestamp:2026-03-09 13:20:48.147579719 +0000 UTC m=+3.397751637,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.740735 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee70a91c514 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:48.150070548 +0000 UTC m=+3.400242466,LastTimestamp:2026-03-09 13:20:48.150070548 +0000 UTC m=+3.400242466,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.744011 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee70aa27556 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:48.151164246 +0000 UTC m=+3.401336154,LastTimestamp:2026-03-09 13:20:48.151164246 +0000 UTC m=+3.401336154,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.747398 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee7154fe183 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:48.330301827 +0000 UTC m=+3.580473735,LastTimestamp:2026-03-09 13:20:48.330301827 +0000 UTC m=+3.580473735,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.750750 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee715fff996 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:48.341842326 +0000 UTC m=+3.592014244,LastTimestamp:2026-03-09 13:20:48.341842326 +0000 UTC m=+3.592014244,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.754091 
4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee7161286eb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:48.343058155 +0000 UTC m=+3.593230073,LastTimestamp:2026-03-09 13:20:48.343058155 +0000 UTC m=+3.593230073,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.757282 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee71f7d4291 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:48.501047953 +0000 UTC m=+3.751219871,LastTimestamp:2026-03-09 13:20:48.501047953 +0000 UTC 
m=+3.751219871,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.761159 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee7201be9c3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:48.511445443 +0000 UTC m=+3.761617351,LastTimestamp:2026-03-09 13:20:48.511445443 +0000 UTC m=+3.761617351,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.764788 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee72590fcd6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:48.603004118 +0000 UTC m=+3.853176026,LastTimestamp:2026-03-09 13:20:48.603004118 +0000 UTC m=+3.853176026,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.768402 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee730b5d425 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:48.789967909 +0000 UTC m=+4.040139827,LastTimestamp:2026-03-09 13:20:48.789967909 +0000 UTC m=+4.040139827,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.772148 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee7320a4493 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container 
etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:48.812278931 +0000 UTC m=+4.062450849,LastTimestamp:2026-03-09 13:20:48.812278931 +0000 UTC m=+4.062450849,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.777150 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee761ac119c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:49.611411868 +0000 UTC m=+4.861583776,LastTimestamp:2026-03-09 13:20:49.611411868 +0000 UTC m=+4.861583776,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.780909 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee76b06f41d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:49.768363037 +0000 UTC m=+5.018534945,LastTimestamp:2026-03-09 13:20:49.768363037 +0000 UTC m=+5.018534945,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.783962 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee76b880026 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:49.776820262 +0000 UTC m=+5.026992170,LastTimestamp:2026-03-09 13:20:49.776820262 +0000 UTC m=+5.026992170,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.787213 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee76b9749e4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:49.77782218 +0000 UTC m=+5.027994088,LastTimestamp:2026-03-09 13:20:49.77782218 +0000 UTC m=+5.027994088,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.790464 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee774e9e7e5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:49.934231525 +0000 UTC m=+5.184403473,LastTimestamp:2026-03-09 13:20:49.934231525 +0000 UTC m=+5.184403473,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.793756 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee775959394 openshift-etcd 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:49.945482132 +0000 UTC m=+5.195654050,LastTimestamp:2026-03-09 13:20:49.945482132 +0000 UTC m=+5.195654050,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.797450 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee775a4a94a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:49.94647073 +0000 UTC m=+5.196642648,LastTimestamp:2026-03-09 13:20:49.94647073 +0000 UTC m=+5.196642648,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.801006 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee78113039b 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:50.138252187 +0000 UTC m=+5.388424095,LastTimestamp:2026-03-09 13:20:50.138252187 +0000 UTC m=+5.388424095,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.806276 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee781cf1248 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:50.150576712 +0000 UTC m=+5.400748620,LastTimestamp:2026-03-09 13:20:50.150576712 +0000 UTC m=+5.400748620,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.810216 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee781e3a579 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:50.151925113 +0000 UTC m=+5.402097041,LastTimestamp:2026-03-09 13:20:50.151925113 +0000 UTC m=+5.402097041,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.813594 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee78d19debc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:50.340028092 +0000 UTC m=+5.590200010,LastTimestamp:2026-03-09 13:20:50.340028092 +0000 UTC m=+5.590200010,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.817625 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee78dcccc17 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:50.351754263 +0000 UTC m=+5.601926171,LastTimestamp:2026-03-09 13:20:50.351754263 +0000 UTC m=+5.601926171,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.820950 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee78ddfdea0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:50.353004192 +0000 UTC m=+5.603176100,LastTimestamp:2026-03-09 13:20:50.353004192 +0000 UTC m=+5.603176100,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.824800 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189b2ee796fd4daf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:50.505928111 +0000 UTC m=+5.756100019,LastTimestamp:2026-03-09 13:20:50.505928111 +0000 UTC m=+5.756100019,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.827772 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee79789d9f6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:50.515139062 +0000 UTC m=+5.765310970,LastTimestamp:2026-03-09 13:20:50.515139062 +0000 UTC m=+5.765310970,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.833323 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 13:21:49 crc kubenswrapper[4764]: 
&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee8e3ce739d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 09 13:21:49 crc kubenswrapper[4764]: body: Mar 09 13:21:49 crc kubenswrapper[4764]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:56.089670557 +0000 UTC m=+11.339842495,LastTimestamp:2026-03-09 13:20:56.089670557 +0000 UTC m=+11.339842495,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 13:21:49 crc kubenswrapper[4764]: > Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.837019 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee8e3cfea81 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:56.089766529 +0000 UTC 
m=+11.339938447,LastTimestamp:2026-03-09 13:20:56.089766529 +0000 UTC m=+11.339938447,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.840567 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 09 13:21:49 crc kubenswrapper[4764]: &Event{ObjectMeta:{kube-apiserver-crc.189b2ee99637bf19 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Liveness probe error: Get "https://192.168.126.11:17697/healthz": read tcp 192.168.126.11:58756->192.168.126.11:17697: read: connection reset by peer Mar 09 13:21:49 crc kubenswrapper[4764]: body: Mar 09 13:21:49 crc kubenswrapper[4764]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:59.082915609 +0000 UTC m=+14.333087537,LastTimestamp:2026-03-09 13:20:59.082915609 +0000 UTC m=+14.333087537,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 13:21:49 crc kubenswrapper[4764]: > Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.844400 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee996388eea openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Liveness probe failed: Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:58756->192.168.126.11:17697: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:59.08296881 +0000 UTC m=+14.333140738,LastTimestamp:2026-03-09 13:20:59.08296881 +0000 UTC m=+14.333140738,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.849447 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 09 13:21:49 crc kubenswrapper[4764]: &Event{ObjectMeta:{kube-apiserver-crc.189b2ee9a99b35e1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 09 13:21:49 crc kubenswrapper[4764]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 09 13:21:49 crc kubenswrapper[4764]: Mar 09 13:21:49 crc kubenswrapper[4764]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:59.408201185 +0000 UTC m=+14.658373103,LastTimestamp:2026-03-09 13:20:59.408201185 +0000 UTC 
m=+14.658373103,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 13:21:49 crc kubenswrapper[4764]: > Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.852822 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee9a99bd790 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:59.408242576 +0000 UTC m=+14.658414504,LastTimestamp:2026-03-09 13:20:59.408242576 +0000 UTC m=+14.658414504,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.856550 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b2ee9a99b35e1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 09 13:21:49 crc kubenswrapper[4764]: &Event{ObjectMeta:{kube-apiserver-crc.189b2ee9a99b35e1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe 
error: HTTP probe failed with statuscode: 403 Mar 09 13:21:49 crc kubenswrapper[4764]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 09 13:21:49 crc kubenswrapper[4764]: Mar 09 13:21:49 crc kubenswrapper[4764]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:59.408201185 +0000 UTC m=+14.658373103,LastTimestamp:2026-03-09 13:20:59.414788 +0000 UTC m=+14.664959908,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 13:21:49 crc kubenswrapper[4764]: > Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.859696 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b2ee9a99bd790\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee9a99bd790 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:59.408242576 +0000 UTC m=+14.658414504,LastTimestamp:2026-03-09 13:20:59.414822691 +0000 UTC m=+14.664994599,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.863256 4764 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-apiserver-crc.189b2ee7161286eb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee7161286eb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:48.343058155 +0000 UTC m=+3.593230073,LastTimestamp:2026-03-09 13:20:59.64380987 +0000 UTC m=+14.893981778,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.869451 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 13:21:49 crc kubenswrapper[4764]: &Event{ObjectMeta:{kube-controller-manager-crc.189b2eeb37ea9416 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) 
Mar 09 13:21:49 crc kubenswrapper[4764]: body: Mar 09 13:21:49 crc kubenswrapper[4764]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:21:06.090734614 +0000 UTC m=+21.340906542,LastTimestamp:2026-03-09 13:21:06.090734614 +0000 UTC m=+21.340906542,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 13:21:49 crc kubenswrapper[4764]: > Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.874126 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2eeb37eb4e2f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:21:06.090782255 +0000 UTC m=+21.340954163,LastTimestamp:2026-03-09 13:21:06.090782255 +0000 UTC m=+21.340954163,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.877727 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2eeb37ea9416\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event=< Mar 09 13:21:49 crc kubenswrapper[4764]: &Event{ObjectMeta:{kube-controller-manager-crc.189b2eeb37ea9416 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 13:21:49 crc kubenswrapper[4764]: body: Mar 09 13:21:49 crc kubenswrapper[4764]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:21:06.090734614 +0000 UTC m=+21.340906542,LastTimestamp:2026-03-09 13:21:16.090163232 +0000 UTC m=+31.340335170,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 13:21:49 crc kubenswrapper[4764]: > Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.880662 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2eeb37eb4e2f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2eeb37eb4e2f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while 
waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:21:06.090782255 +0000 UTC m=+21.340954163,LastTimestamp:2026-03-09 13:21:16.090308845 +0000 UTC m=+31.340480793,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.883515 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2eed8c1cfe41 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:21:16.093259329 +0000 UTC m=+31.343431247,LastTimestamp:2026-03-09 13:21:16.093259329 +0000 UTC m=+31.343431247,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.886579 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2ee6b5cb3b48\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee6b5cb3b48 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.727773 +0000 UTC m=+1.977944908,LastTimestamp:2026-03-09 13:21:16.258092657 +0000 UTC m=+31.508264565,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.889599 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2ee6ca74dc64\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee6ca74dc64 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.074434148 +0000 UTC m=+2.324606056,LastTimestamp:2026-03-09 13:21:16.443096842 +0000 UTC m=+31.693268740,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.892515 4764 event.go:359] "Server 
rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2ee6cb076563\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee6cb076563 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.084037475 +0000 UTC m=+2.334209383,LastTimestamp:2026-03-09 13:21:16.454293273 +0000 UTC m=+31.704465181,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.897085 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2eeb37ea9416\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 13:21:49 crc kubenswrapper[4764]: &Event{ObjectMeta:{kube-controller-manager-crc.189b2eeb37ea9416 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting 
headers) Mar 09 13:21:49 crc kubenswrapper[4764]: body: Mar 09 13:21:49 crc kubenswrapper[4764]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:21:06.090734614 +0000 UTC m=+21.340906542,LastTimestamp:2026-03-09 13:21:26.090168382 +0000 UTC m=+41.340340300,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 13:21:49 crc kubenswrapper[4764]: > Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.900565 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2eeb37eb4e2f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2eeb37eb4e2f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:21:06.090782255 +0000 UTC m=+21.340954163,LastTimestamp:2026-03-09 13:21:26.090219873 +0000 UTC m=+41.340391781,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.904958 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2eeb37ea9416\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 13:21:49 crc kubenswrapper[4764]: &Event{ObjectMeta:{kube-controller-manager-crc.189b2eeb37ea9416 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 13:21:49 crc kubenswrapper[4764]: body: Mar 09 13:21:49 crc kubenswrapper[4764]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:21:06.090734614 +0000 UTC m=+21.340906542,LastTimestamp:2026-03-09 13:21:36.090367668 +0000 UTC m=+51.340539616,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 13:21:49 crc kubenswrapper[4764]: > Mar 09 13:21:50 crc kubenswrapper[4764]: I0309 13:21:50.499240 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:50 crc kubenswrapper[4764]: I0309 13:21:50.766459 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:21:50 crc kubenswrapper[4764]: I0309 13:21:50.766676 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:50 crc kubenswrapper[4764]: I0309 13:21:50.767681 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 09 13:21:50 crc kubenswrapper[4764]: I0309 13:21:50.767716 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:50 crc kubenswrapper[4764]: I0309 13:21:50.767726 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:51 crc kubenswrapper[4764]: I0309 13:21:51.499885 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:52 crc kubenswrapper[4764]: I0309 13:21:52.497692 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:52 crc kubenswrapper[4764]: I0309 13:21:52.918854 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:21:52 crc kubenswrapper[4764]: I0309 13:21:52.919062 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:52 crc kubenswrapper[4764]: I0309 13:21:52.920194 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:52 crc kubenswrapper[4764]: I0309 13:21:52.920236 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:52 crc kubenswrapper[4764]: I0309 13:21:52.920248 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:52 crc kubenswrapper[4764]: I0309 13:21:52.920987 4764 scope.go:117] "RemoveContainer" 
containerID="033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356" Mar 09 13:21:52 crc kubenswrapper[4764]: E0309 13:21:52.921185 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:21:53 crc kubenswrapper[4764]: I0309 13:21:53.090110 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:21:53 crc kubenswrapper[4764]: I0309 13:21:53.090522 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:53 crc kubenswrapper[4764]: I0309 13:21:53.091735 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:53 crc kubenswrapper[4764]: I0309 13:21:53.091766 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:53 crc kubenswrapper[4764]: I0309 13:21:53.091778 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:53 crc kubenswrapper[4764]: I0309 13:21:53.095218 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:21:53 crc kubenswrapper[4764]: I0309 13:21:53.497562 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:53 crc kubenswrapper[4764]: I0309 
13:21:53.559247 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:53 crc kubenswrapper[4764]: I0309 13:21:53.560302 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:53 crc kubenswrapper[4764]: I0309 13:21:53.560333 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:53 crc kubenswrapper[4764]: I0309 13:21:53.560345 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:53 crc kubenswrapper[4764]: I0309 13:21:53.817781 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:53 crc kubenswrapper[4764]: I0309 13:21:53.818691 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:53 crc kubenswrapper[4764]: I0309 13:21:53.818821 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:53 crc kubenswrapper[4764]: I0309 13:21:53.818908 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:54 crc kubenswrapper[4764]: I0309 13:21:54.497881 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:54 crc kubenswrapper[4764]: E0309 13:21:54.852760 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 09 13:21:54 crc kubenswrapper[4764]: I0309 
13:21:54.856133 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:54 crc kubenswrapper[4764]: I0309 13:21:54.857122 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:54 crc kubenswrapper[4764]: I0309 13:21:54.857148 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:54 crc kubenswrapper[4764]: I0309 13:21:54.857157 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:54 crc kubenswrapper[4764]: I0309 13:21:54.857180 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:21:54 crc kubenswrapper[4764]: E0309 13:21:54.861019 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 09 13:21:55 crc kubenswrapper[4764]: I0309 13:21:55.497328 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:55 crc kubenswrapper[4764]: E0309 13:21:55.622316 4764 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 13:21:56 crc kubenswrapper[4764]: I0309 13:21:56.498831 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:56 crc kubenswrapper[4764]: I0309 13:21:56.841653 4764 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating 
certificates Mar 09 13:21:56 crc kubenswrapper[4764]: I0309 13:21:56.855604 4764 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 09 13:21:57 crc kubenswrapper[4764]: W0309 13:21:57.237784 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 09 13:21:57 crc kubenswrapper[4764]: E0309 13:21:57.237862 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 09 13:21:57 crc kubenswrapper[4764]: I0309 13:21:57.498081 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:58 crc kubenswrapper[4764]: I0309 13:21:58.498915 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:59 crc kubenswrapper[4764]: I0309 13:21:59.499375 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:22:00 crc kubenswrapper[4764]: I0309 13:22:00.306032 4764 csr.go:261] certificate signing request csr-b28kp is approved, waiting to be issued Mar 09 13:22:00 crc kubenswrapper[4764]: I0309 13:22:00.312984 4764 csr.go:257] certificate signing 
request csr-b28kp is issued Mar 09 13:22:00 crc kubenswrapper[4764]: I0309 13:22:00.332064 4764 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 09 13:22:00 crc kubenswrapper[4764]: I0309 13:22:00.409881 4764 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 09 13:22:00 crc kubenswrapper[4764]: I0309 13:22:00.769631 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:22:00 crc kubenswrapper[4764]: I0309 13:22:00.769784 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:22:00 crc kubenswrapper[4764]: I0309 13:22:00.770945 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:00 crc kubenswrapper[4764]: I0309 13:22:00.770992 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:00 crc kubenswrapper[4764]: I0309 13:22:00.771004 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.314724 4764 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-18 20:27:13.663276113 +0000 UTC Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.314780 4764 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7567h5m12.348503808s for next certificate rotation Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.861692 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.863736 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.863900 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.864019 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.864239 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.874606 4764 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.875023 4764 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 09 13:22:01 crc kubenswrapper[4764]: E0309 13:22:01.875075 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.878871 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.878902 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.878912 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.878927 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.878937 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:01Z","lastTransitionTime":"2026-03-09T13:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:01 crc kubenswrapper[4764]: E0309 13:22:01.897048 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.909290 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.909337 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.909351 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.909369 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.909381 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:01Z","lastTransitionTime":"2026-03-09T13:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:01 crc kubenswrapper[4764]: E0309 13:22:01.926767 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.936250 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.936305 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.936317 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.936335 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.936349 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:01Z","lastTransitionTime":"2026-03-09T13:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:01 crc kubenswrapper[4764]: E0309 13:22:01.947247 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.955214 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.955283 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.955296 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.955330 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.955343 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:01Z","lastTransitionTime":"2026-03-09T13:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:01 crc kubenswrapper[4764]: E0309 13:22:01.966250 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:01 crc kubenswrapper[4764]: E0309 13:22:01.966414 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:22:01 crc kubenswrapper[4764]: E0309 13:22:01.966448 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:02 crc kubenswrapper[4764]: E0309 13:22:02.066831 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:02 crc kubenswrapper[4764]: E0309 13:22:02.167604 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:02 crc kubenswrapper[4764]: E0309 13:22:02.268371 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:02 crc kubenswrapper[4764]: E0309 13:22:02.368968 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:02 crc kubenswrapper[4764]: E0309 13:22:02.469081 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:02 crc kubenswrapper[4764]: E0309 13:22:02.569761 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:02 crc kubenswrapper[4764]: E0309 13:22:02.670815 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:02 crc kubenswrapper[4764]: E0309 13:22:02.771384 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:02 crc kubenswrapper[4764]: E0309 13:22:02.872516 4764 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:02 crc kubenswrapper[4764]: E0309 13:22:02.972611 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:03 crc kubenswrapper[4764]: E0309 13:22:03.072749 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:03 crc kubenswrapper[4764]: E0309 13:22:03.173394 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:03 crc kubenswrapper[4764]: E0309 13:22:03.273676 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:03 crc kubenswrapper[4764]: E0309 13:22:03.374612 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:03 crc kubenswrapper[4764]: E0309 13:22:03.475506 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:03 crc kubenswrapper[4764]: I0309 13:22:03.559690 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:22:03 crc kubenswrapper[4764]: I0309 13:22:03.560815 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:03 crc kubenswrapper[4764]: I0309 13:22:03.560853 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:03 crc kubenswrapper[4764]: I0309 13:22:03.560866 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:03 crc kubenswrapper[4764]: I0309 13:22:03.561587 4764 scope.go:117] "RemoveContainer" containerID="033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356" Mar 09 13:22:03 
crc kubenswrapper[4764]: E0309 13:22:03.561785 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:22:03 crc kubenswrapper[4764]: E0309 13:22:03.576786 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:03 crc kubenswrapper[4764]: E0309 13:22:03.677579 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:03 crc kubenswrapper[4764]: E0309 13:22:03.778486 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:03 crc kubenswrapper[4764]: E0309 13:22:03.879136 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:03 crc kubenswrapper[4764]: E0309 13:22:03.979874 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:04 crc kubenswrapper[4764]: E0309 13:22:04.080826 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:04 crc kubenswrapper[4764]: E0309 13:22:04.181794 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:04 crc kubenswrapper[4764]: E0309 13:22:04.282743 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:04 crc kubenswrapper[4764]: I0309 13:22:04.378115 4764 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 09 13:22:04 
crc kubenswrapper[4764]: E0309 13:22:04.383005 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:04 crc kubenswrapper[4764]: E0309 13:22:04.484049 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:04 crc kubenswrapper[4764]: I0309 13:22:04.558906 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:22:04 crc kubenswrapper[4764]: I0309 13:22:04.560122 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:04 crc kubenswrapper[4764]: I0309 13:22:04.560178 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:04 crc kubenswrapper[4764]: I0309 13:22:04.560191 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:04 crc kubenswrapper[4764]: E0309 13:22:04.584373 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:04 crc kubenswrapper[4764]: E0309 13:22:04.685484 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:04 crc kubenswrapper[4764]: E0309 13:22:04.786509 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:04 crc kubenswrapper[4764]: E0309 13:22:04.887115 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:04 crc kubenswrapper[4764]: E0309 13:22:04.987449 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:05 crc kubenswrapper[4764]: E0309 13:22:05.088288 4764 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 09 13:22:05 crc kubenswrapper[4764]: E0309 13:22:05.189022 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:05 crc kubenswrapper[4764]: E0309 13:22:05.289369 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:05 crc kubenswrapper[4764]: E0309 13:22:05.389810 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:05 crc kubenswrapper[4764]: E0309 13:22:05.490307 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:05 crc kubenswrapper[4764]: E0309 13:22:05.590775 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:05 crc kubenswrapper[4764]: E0309 13:22:05.623413 4764 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 13:22:05 crc kubenswrapper[4764]: E0309 13:22:05.691529 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:05 crc kubenswrapper[4764]: E0309 13:22:05.792315 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:05 crc kubenswrapper[4764]: E0309 13:22:05.892722 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:05 crc kubenswrapper[4764]: E0309 13:22:05.993514 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:06 crc kubenswrapper[4764]: E0309 13:22:06.094343 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:06 crc kubenswrapper[4764]: E0309 13:22:06.195119 4764 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:06 crc kubenswrapper[4764]: E0309 13:22:06.295816 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:06 crc kubenswrapper[4764]: E0309 13:22:06.396020 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:06 crc kubenswrapper[4764]: E0309 13:22:06.496888 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:06 crc kubenswrapper[4764]: E0309 13:22:06.597104 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:06 crc kubenswrapper[4764]: E0309 13:22:06.697959 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:06 crc kubenswrapper[4764]: E0309 13:22:06.798934 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:06 crc kubenswrapper[4764]: E0309 13:22:06.899844 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:07 crc kubenswrapper[4764]: E0309 13:22:07.000684 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:07 crc kubenswrapper[4764]: E0309 13:22:07.101676 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:07 crc kubenswrapper[4764]: E0309 13:22:07.202532 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:07 crc kubenswrapper[4764]: E0309 13:22:07.303439 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:07 crc 
kubenswrapper[4764]: E0309 13:22:07.404004 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:07 crc kubenswrapper[4764]: E0309 13:22:07.504785 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:07 crc kubenswrapper[4764]: E0309 13:22:07.605229 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:07 crc kubenswrapper[4764]: E0309 13:22:07.706370 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:07 crc kubenswrapper[4764]: E0309 13:22:07.807337 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:07 crc kubenswrapper[4764]: E0309 13:22:07.908035 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:08 crc kubenswrapper[4764]: E0309 13:22:08.008993 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:08 crc kubenswrapper[4764]: E0309 13:22:08.109976 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:08 crc kubenswrapper[4764]: E0309 13:22:08.211208 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:08 crc kubenswrapper[4764]: E0309 13:22:08.312286 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:08 crc kubenswrapper[4764]: E0309 13:22:08.413462 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:08 crc kubenswrapper[4764]: E0309 13:22:08.513870 4764 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 09 13:22:08 crc kubenswrapper[4764]: E0309 13:22:08.614359 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:08 crc kubenswrapper[4764]: E0309 13:22:08.715437 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:08 crc kubenswrapper[4764]: E0309 13:22:08.816171 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:08 crc kubenswrapper[4764]: E0309 13:22:08.917198 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:09 crc kubenswrapper[4764]: E0309 13:22:09.017549 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:09 crc kubenswrapper[4764]: E0309 13:22:09.118380 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:09 crc kubenswrapper[4764]: E0309 13:22:09.218515 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:09 crc kubenswrapper[4764]: E0309 13:22:09.318982 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:09 crc kubenswrapper[4764]: E0309 13:22:09.419122 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:09 crc kubenswrapper[4764]: E0309 13:22:09.519546 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:09 crc kubenswrapper[4764]: E0309 13:22:09.620350 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:09 crc kubenswrapper[4764]: E0309 13:22:09.721619 4764 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:09 crc kubenswrapper[4764]: E0309 13:22:09.822814 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:09 crc kubenswrapper[4764]: E0309 13:22:09.923315 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:10 crc kubenswrapper[4764]: E0309 13:22:10.023827 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:10 crc kubenswrapper[4764]: E0309 13:22:10.124932 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:10 crc kubenswrapper[4764]: E0309 13:22:10.226081 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:10 crc kubenswrapper[4764]: E0309 13:22:10.326781 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:10 crc kubenswrapper[4764]: E0309 13:22:10.427248 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:10 crc kubenswrapper[4764]: E0309 13:22:10.527969 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:10 crc kubenswrapper[4764]: E0309 13:22:10.628824 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:10 crc kubenswrapper[4764]: E0309 13:22:10.729967 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:10 crc kubenswrapper[4764]: E0309 13:22:10.830349 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:10 crc kubenswrapper[4764]: E0309 
13:22:10.931146 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:11 crc kubenswrapper[4764]: E0309 13:22:11.031293 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:11 crc kubenswrapper[4764]: E0309 13:22:11.131475 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:11 crc kubenswrapper[4764]: E0309 13:22:11.232549 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:11 crc kubenswrapper[4764]: E0309 13:22:11.333555 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:11 crc kubenswrapper[4764]: E0309 13:22:11.434727 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:11 crc kubenswrapper[4764]: E0309 13:22:11.535011 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:11 crc kubenswrapper[4764]: E0309 13:22:11.635586 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:11 crc kubenswrapper[4764]: E0309 13:22:11.736718 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:11 crc kubenswrapper[4764]: E0309 13:22:11.837722 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:11 crc kubenswrapper[4764]: E0309 13:22:11.938743 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:12 crc kubenswrapper[4764]: E0309 13:22:12.039847 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 
13:22:12 crc kubenswrapper[4764]: E0309 13:22:12.140924 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:12 crc kubenswrapper[4764]: E0309 13:22:12.241186 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:12 crc kubenswrapper[4764]: E0309 13:22:12.342452 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:12 crc kubenswrapper[4764]: E0309 13:22:12.360955 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.364721 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.364750 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.364759 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.364772 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.364782 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:12Z","lastTransitionTime":"2026-03-09T13:22:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:12 crc kubenswrapper[4764]: E0309 13:22:12.374238 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.382506 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.382550 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.382560 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.382579 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.382589 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:12Z","lastTransitionTime":"2026-03-09T13:22:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:12 crc kubenswrapper[4764]: E0309 13:22:12.391992 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.395359 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.395408 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.395418 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.395433 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.395462 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:12Z","lastTransitionTime":"2026-03-09T13:22:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:12 crc kubenswrapper[4764]: E0309 13:22:12.404787 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.407859 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.407884 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.407892 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.407905 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.407915 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:12Z","lastTransitionTime":"2026-03-09T13:22:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:12 crc kubenswrapper[4764]: E0309 13:22:12.416860 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:12 crc kubenswrapper[4764]: E0309 13:22:12.416959 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:22:12 crc kubenswrapper[4764]: E0309 13:22:12.443381 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:12 crc kubenswrapper[4764]: E0309 13:22:12.544379 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:12 crc kubenswrapper[4764]: E0309 13:22:12.644947 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:12 crc kubenswrapper[4764]: E0309 13:22:12.745357 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:12 crc kubenswrapper[4764]: E0309 13:22:12.845723 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:12 crc kubenswrapper[4764]: E0309 13:22:12.946860 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:13 crc kubenswrapper[4764]: E0309 13:22:13.047951 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:13 crc kubenswrapper[4764]: E0309 13:22:13.148183 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:13 crc kubenswrapper[4764]: E0309 13:22:13.248366 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:13 crc kubenswrapper[4764]: E0309 13:22:13.348912 4764 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:13 crc kubenswrapper[4764]: E0309 13:22:13.449597 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:13 crc kubenswrapper[4764]: E0309 13:22:13.550314 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:13 crc kubenswrapper[4764]: E0309 13:22:13.651271 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:13 crc kubenswrapper[4764]: E0309 13:22:13.751815 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:13 crc kubenswrapper[4764]: E0309 13:22:13.852762 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:13 crc kubenswrapper[4764]: E0309 13:22:13.953479 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.054428 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.155799 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.256394 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.356860 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.457245 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:14 crc 
kubenswrapper[4764]: I0309 13:22:14.521687 4764 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.531439 4764 apiserver.go:52] "Watching apiserver" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.536886 4764 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.537259 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.537810 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.537921 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.537980 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.538260 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.538310 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.537810 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.538409 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.539278 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.539378 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.539762 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.541763 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.542008 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.542245 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.542346 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.542369 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.543002 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.543153 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.545023 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.560007 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:14 crc 
kubenswrapper[4764]: I0309 13:22:14.560065 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.560078 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.560096 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.560109 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:14Z","lastTransitionTime":"2026-03-09T13:22:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.570942 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.580789 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.591898 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.595327 4764 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.600973 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.606472 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod 
\"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.606525 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.606558 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.606587 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.606614 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.606670 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.606701 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.606729 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.606757 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.606789 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.606823 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.606849 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " 
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.606878 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.606935 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.606949 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.606949 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607032 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607052 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607069 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607087 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607102 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607163 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607180 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607197 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607213 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607227 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " 
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607242 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607257 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607271 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607285 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607299 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607314 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607331 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607346 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607361 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607376 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607392 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607409 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607427 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607442 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607458 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607477 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607494 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: 
\"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607509 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607547 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607564 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607578 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607594 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") 
" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607611 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607625 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607654 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607691 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607706 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607724 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod 
\"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607739 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607754 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607771 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607788 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607804 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607822 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607840 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607855 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607870 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607910 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607950 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod 
\"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607966 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607980 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607762 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607785 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.608957 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607943 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.608979 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609003 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.608041 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.608147 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.608143 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.608263 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.608316 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.608404 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.608563 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.608691 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.608707 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.608762 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.608809 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.608893 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.608984 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609140 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609212 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609220 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609315 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609555 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609602 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609623 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609656 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: 
\"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609672 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609686 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609701 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609719 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609736 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609752 
4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609768 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609784 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609800 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609815 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609830 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609845 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609860 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609876 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609890 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609906 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609911 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609921 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609938 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609953 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609968 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609984 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: 
\"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609998 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610015 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610033 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610048 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610063 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 
09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610077 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610097 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610115 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610131 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610146 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610161 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" 
(UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610176 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610168 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610191 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610208 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610222 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610236 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610255 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610276 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610300 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610316 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610326 4764 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610334 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610387 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610448 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610475 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610496 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610513 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610552 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610584 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610606 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610628 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 09 13:22:14 crc 
kubenswrapper[4764]: I0309 13:22:14.610664 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610685 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610709 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610732 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610753 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610776 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") 
pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610799 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610820 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610842 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610865 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610889 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610915 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610933 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610950 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610966 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610981 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610997 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") 
pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611013 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611027 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611044 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611060 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611075 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 
13:22:14.611090 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611105 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611121 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611136 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611152 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611168 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611190 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611215 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611236 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611252 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611271 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611288 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611303 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611320 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611337 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611352 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611367 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611383 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611399 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611415 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611431 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611446 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611464 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611485 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611508 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611524 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611549 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611565 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611581 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611597 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611611 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611627 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611656 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611673 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611692 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611716 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611741 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611765 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611792 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 09 13:22:14 crc 
kubenswrapper[4764]: I0309 13:22:14.611813 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611829 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611846 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611863 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611879 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611895 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: 
\"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611912 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611994 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612014 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612033 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612048 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 
13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612065 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612082 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612099 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612116 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612200 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612230 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612252 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612273 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612291 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612313 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 
13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612351 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612381 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612405 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612430 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612453 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612477 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612513 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612537 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612596 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612610 4764 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612622 4764 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612637 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612667 4764 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612682 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612694 4764 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612706 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612718 4764 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612731 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612743 4764 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612756 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612803 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612818 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612832 4764 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612846 4764 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612858 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612868 4764 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612878 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612887 4764 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612896 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612907 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612916 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612925 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node 
\"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612935 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612946 4764 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612954 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612964 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612973 4764 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.613702 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610818 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610867 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611059 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611628 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611626 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611759 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612126 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612169 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612139 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612511 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612523 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612612 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612772 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.613439 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.613456 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.613825 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.613894 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.613923 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.613988 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.614158 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.614248 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.614257 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.614415 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.614772 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.614952 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.615148 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.615201 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.615280 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.615398 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.615697 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.615857 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.615884 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.616052 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.613246 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.616224 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.616550 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.617168 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.617184 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.617139 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.617094 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.617333 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.617632 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.617852 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.617958 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.617994 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.618187 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.623224 4764 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.623837 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.623849 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.624292 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.624567 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.624739 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.624922 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.624980 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.625012 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.625190 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.625489 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.625732 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.625757 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.625763 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.625869 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.626138 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.626852 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.627996 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.628206 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.628268 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:15.128247985 +0000 UTC m=+90.378419893 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.628877 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.630475 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:22:15.130454773 +0000 UTC m=+90.380626701 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.630883 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.631457 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.631487 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.634043 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:15.133509113 +0000 UTC m=+90.383681031 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.636915 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.637276 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.640789 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.643588 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.643614 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.643629 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.643720 4764 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:15.143701191 +0000 UTC m=+90.393873109 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.647067 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.647565 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.647566 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.647893 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.649475 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.649916 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.649946 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.650139 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.650393 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.650894 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.651067 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.655279 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.655359 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.655769 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.655930 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.656916 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.657410 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.655815 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.657510 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.657597 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.658046 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.658433 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.658690 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.658830 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.659038 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.659423 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.659489 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.659828 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.659848 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.659859 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.659904 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:15.159889156 +0000 UTC m=+90.410061054 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.660086 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.661731 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.661743 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.661821 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.662229 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.662283 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.662866 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.663315 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.663373 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.663357 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.663497 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.663519 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.663529 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.663545 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.663556 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:14Z","lastTransitionTime":"2026-03-09T13:22:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.663773 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.664842 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.665075 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.665134 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.665172 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.665836 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.665908 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.665920 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.665958 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.665508 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.666150 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.664979 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.666696 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.666412 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.666502 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.666939 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.666956 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.667042 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.666587 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.666895 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.667229 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.667389 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.667574 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.667876 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.667939 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.667949 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.668033 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.668281 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.668288 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.668596 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.668957 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.668909 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.669199 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.669206 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.669277 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.669335 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.669407 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.669523 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.669721 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.669790 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.669489 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.669969 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.670254 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.670706 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.670790 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.673718 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.673857 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.673928 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.673933 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.673956 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.674072 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.674249 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.674461 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.674540 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.674548 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.675163 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.676271 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.676319 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.676504 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.676798 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.677090 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.676677 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.677724 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.677968 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.678075 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.683992 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.686412 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.702052 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.705565 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713625 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713692 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713718 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713773 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 
13:22:14.713794 4764 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713803 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713814 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713823 4764 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713831 4764 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713839 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713849 4764 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713857 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713851 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713865 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713911 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713924 4764 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713934 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713943 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713952 4764 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713961 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713970 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713978 4764 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713986 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713995 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714003 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714011 4764 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714020 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714029 4764 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714039 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714049 4764 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714057 4764 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714066 4764 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714076 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on 
node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714083 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714092 4764 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714100 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714109 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714117 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714125 4764 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714134 4764 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714143 4764 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714151 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714159 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714167 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714176 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714185 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714194 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714204 4764 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714212 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714221 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714229 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714237 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714245 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714254 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714261 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc 
kubenswrapper[4764]: I0309 13:22:14.714269 4764 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714277 4764 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714285 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714294 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714306 4764 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714316 4764 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714328 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714339 4764 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714380 4764 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714389 4764 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714400 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714410 4764 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714420 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714430 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714440 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714449 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714457 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714466 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714476 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714485 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714495 4764 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714505 4764 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc 
kubenswrapper[4764]: I0309 13:22:14.714515 4764 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714524 4764 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714533 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714545 4764 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714555 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714564 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714575 4764 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714584 4764 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714594 4764 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714603 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714611 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714620 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714629 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714637 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714659 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714668 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714677 4764 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714687 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714694 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714702 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714711 4764 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714719 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node 
\"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714729 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714738 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714751 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714760 4764 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714770 4764 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714779 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714788 4764 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714796 4764 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714807 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714815 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714823 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714833 4764 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714843 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714853 4764 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714861 4764 
reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714869 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714877 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714885 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714893 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714900 4764 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714908 4764 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714916 4764 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714924 4764 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714932 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714939 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714948 4764 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714955 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714964 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714971 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" 
DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714979 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714987 4764 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714994 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715002 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715011 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715019 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715027 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc 
kubenswrapper[4764]: I0309 13:22:14.715036 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715044 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715052 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715061 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715070 4764 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715078 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715100 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715108 4764 
reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715117 4764 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715125 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715133 4764 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715140 4764 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715149 4764 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715157 4764 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715165 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715173 4764 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715181 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715189 4764 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715197 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715205 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715213 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715221 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" 
DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715229 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715237 4764 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715244 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715252 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715260 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715268 4764 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715276 4764 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 
13:22:14.715284 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715293 4764 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715301 4764 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715309 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715317 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.769367 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.769571 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.769599 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.769617 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 
09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.769633 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:14Z","lastTransitionTime":"2026-03-09T13:22:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.855685 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.863746 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 13:22:14 crc kubenswrapper[4764]: W0309 13:22:14.866743 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-e5d77a6f769d325d48bde0d33b59e1563f232c38daf331f3dc8506a16387e41b WatchSource:0}: Error finding container e5d77a6f769d325d48bde0d33b59e1563f232c38daf331f3dc8506a16387e41b: Status 404 returned error can't find the container with id e5d77a6f769d325d48bde0d33b59e1563f232c38daf331f3dc8506a16387e41b Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.868996 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 13:22:14 crc kubenswrapper[4764]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 09 13:22:14 crc kubenswrapper[4764]: if [[ -f "/env/_master" ]]; then Mar 09 13:22:14 crc kubenswrapper[4764]: set -o allexport Mar 09 13:22:14 crc kubenswrapper[4764]: source "/env/_master" Mar 
09 13:22:14 crc kubenswrapper[4764]: set +o allexport Mar 09 13:22:14 crc kubenswrapper[4764]: fi Mar 09 13:22:14 crc kubenswrapper[4764]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 09 13:22:14 crc kubenswrapper[4764]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 09 13:22:14 crc kubenswrapper[4764]: ho_enable="--enable-hybrid-overlay" Mar 09 13:22:14 crc kubenswrapper[4764]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 09 13:22:14 crc kubenswrapper[4764]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 09 13:22:14 crc kubenswrapper[4764]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 09 13:22:14 crc kubenswrapper[4764]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 09 13:22:14 crc kubenswrapper[4764]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 09 13:22:14 crc kubenswrapper[4764]: --webhook-host=127.0.0.1 \ Mar 09 13:22:14 crc kubenswrapper[4764]: --webhook-port=9743 \ Mar 09 13:22:14 crc kubenswrapper[4764]: ${ho_enable} \ Mar 09 13:22:14 crc kubenswrapper[4764]: --enable-interconnect \ Mar 09 13:22:14 crc kubenswrapper[4764]: --disable-approver \ Mar 09 13:22:14 crc kubenswrapper[4764]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 09 13:22:14 crc kubenswrapper[4764]: --wait-for-kubernetes-api=200s \ Mar 09 13:22:14 crc kubenswrapper[4764]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 09 13:22:14 crc kubenswrapper[4764]: --loglevel="${LOGLEVEL}" Mar 09 13:22:14 crc kubenswrapper[4764]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 09 13:22:14 crc 
kubenswrapper[4764]: > logger="UnhandledError" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.871693 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.871887 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.871929 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.871940 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.871958 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.871970 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:14Z","lastTransitionTime":"2026-03-09T13:22:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.872076 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 13:22:14 crc kubenswrapper[4764]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 09 13:22:14 crc kubenswrapper[4764]: if [[ -f "/env/_master" ]]; then Mar 09 13:22:14 crc kubenswrapper[4764]: set -o allexport Mar 09 13:22:14 crc kubenswrapper[4764]: source "/env/_master" Mar 09 13:22:14 crc kubenswrapper[4764]: set +o allexport Mar 09 13:22:14 crc kubenswrapper[4764]: fi Mar 09 13:22:14 crc kubenswrapper[4764]: Mar 09 13:22:14 crc kubenswrapper[4764]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 09 13:22:14 crc kubenswrapper[4764]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 09 13:22:14 crc kubenswrapper[4764]: --disable-webhook \ Mar 09 13:22:14 crc kubenswrapper[4764]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 09 13:22:14 crc kubenswrapper[4764]: --loglevel="${LOGLEVEL}" Mar 09 13:22:14 crc kubenswrapper[4764]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 09 13:22:14 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.873891 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 09 13:22:14 crc kubenswrapper[4764]: W0309 13:22:14.875048 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-0b14fc603ec2fd3d4793aad753f5d6dff5d9f5af18473d9a20b44a72babf41b0 WatchSource:0}: Error finding container 0b14fc603ec2fd3d4793aad753f5d6dff5d9f5af18473d9a20b44a72babf41b0: Status 404 returned error can't find the container with id 0b14fc603ec2fd3d4793aad753f5d6dff5d9f5af18473d9a20b44a72babf41b0 Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.877159 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 13:22:14 crc kubenswrapper[4764]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 09 13:22:14 crc kubenswrapper[4764]: set -o allexport Mar 09 13:22:14 crc kubenswrapper[4764]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 09 13:22:14 crc kubenswrapper[4764]: source /etc/kubernetes/apiserver-url.env Mar 09 13:22:14 crc kubenswrapper[4764]: else Mar 09 13:22:14 crc kubenswrapper[4764]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 09 13:22:14 crc kubenswrapper[4764]: exit 1 Mar 09 13:22:14 crc kubenswrapper[4764]: fi Mar 09 13:22:14 crc kubenswrapper[4764]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 09 13:22:14 crc kubenswrapper[4764]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 09 13:22:14 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.878613 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.885578 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.886905 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.973792 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.973825 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.973833 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.973845 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.973854 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:14Z","lastTransitionTime":"2026-03-09T13:22:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.075688 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.075725 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.075734 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.075747 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.075758 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:15Z","lastTransitionTime":"2026-03-09T13:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.178345 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.178383 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.178392 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.178406 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.178415 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:15Z","lastTransitionTime":"2026-03-09T13:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.218983 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.219079 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.219154 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:22:16.219125445 +0000 UTC m=+91.469297363 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.219167 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.219212 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.219233 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:16.219223498 +0000 UTC m=+91.469395416 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.219271 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.219325 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.219337 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.219360 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.219377 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.219428 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.219441 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:16.219418953 +0000 UTC m=+91.469590901 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.219460 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:16.219451364 +0000 UTC m=+91.469623282 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.219526 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.219540 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.219552 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.219591 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:16.219578567 +0000 UTC m=+91.469750575 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.281173 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.281212 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.281220 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.281234 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.281243 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:15Z","lastTransitionTime":"2026-03-09T13:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.384066 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.384179 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.384222 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.384241 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.384253 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:15Z","lastTransitionTime":"2026-03-09T13:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.486504 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.486561 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.486574 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.486591 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.486602 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:15Z","lastTransitionTime":"2026-03-09T13:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.563075 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.563954 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.565238 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.566026 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.567238 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.568002 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.568710 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.569839 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.570617 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.571189 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.571859 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.572543 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.573835 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.574451 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.575090 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.576173 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.576940 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.578103 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" 
path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.578676 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.579360 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.580817 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.581549 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.582102 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.583102 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.583890 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" 
path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.585585 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.586232 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.587030 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.589714 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.589868 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.589893 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.589904 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.589923 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.589972 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:15Z","lastTransitionTime":"2026-03-09T13:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.590266 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.591337 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.591842 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.592875 4764 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.592981 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.595128 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.596119 
4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.596521 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.596821 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.598315 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.599136 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.600266 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.601054 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 09 13:22:15 crc 
kubenswrapper[4764]: I0309 13:22:15.602332 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.602945 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.604122 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.604880 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.605480 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.606086 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.606640 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.607686 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.608338 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.609904 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.610528 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.611396 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.611955 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.612926 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.613484 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.613967 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.615756 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.623943 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.691485 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.691520 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.691529 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.691544 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.691553 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:15Z","lastTransitionTime":"2026-03-09T13:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.794197 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.794272 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.794294 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.794323 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.794351 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:15Z","lastTransitionTime":"2026-03-09T13:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.868696 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"0b3c4a4cbe4daceca4eb2a00886df91a34c91833b42afef05407e8fa7c8e57a1"} Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.870507 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"0b14fc603ec2fd3d4793aad753f5d6dff5d9f5af18473d9a20b44a72babf41b0"} Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.870698 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.871502 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e5d77a6f769d325d48bde0d33b59e1563f232c38daf331f3dc8506a16387e41b"} Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.871798 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 13:22:15 crc kubenswrapper[4764]: container 
&Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 09 13:22:15 crc kubenswrapper[4764]: set -o allexport Mar 09 13:22:15 crc kubenswrapper[4764]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 09 13:22:15 crc kubenswrapper[4764]: source /etc/kubernetes/apiserver-url.env Mar 09 13:22:15 crc kubenswrapper[4764]: else Mar 09 13:22:15 crc kubenswrapper[4764]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 09 13:22:15 crc kubenswrapper[4764]: exit 1 Mar 09 13:22:15 crc kubenswrapper[4764]: fi Mar 09 13:22:15 crc kubenswrapper[4764]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 09 13:22:15 crc kubenswrapper[4764]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c
64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_
CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 09 13:22:15 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.871998 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.872867 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 13:22:15 crc kubenswrapper[4764]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 09 13:22:15 crc kubenswrapper[4764]: if [[ -f "/env/_master" ]]; then Mar 09 13:22:15 crc kubenswrapper[4764]: set -o allexport Mar 09 13:22:15 crc kubenswrapper[4764]: source "/env/_master" Mar 09 13:22:15 crc kubenswrapper[4764]: set +o allexport Mar 09 13:22:15 crc kubenswrapper[4764]: fi Mar 09 13:22:15 crc kubenswrapper[4764]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 09 13:22:15 crc kubenswrapper[4764]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 09 13:22:15 crc kubenswrapper[4764]: ho_enable="--enable-hybrid-overlay" Mar 09 13:22:15 crc kubenswrapper[4764]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 09 13:22:15 crc kubenswrapper[4764]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 09 13:22:15 crc kubenswrapper[4764]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 09 13:22:15 crc kubenswrapper[4764]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 09 13:22:15 crc kubenswrapper[4764]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 09 13:22:15 crc kubenswrapper[4764]: --webhook-host=127.0.0.1 \ Mar 09 13:22:15 crc kubenswrapper[4764]: --webhook-port=9743 \ Mar 09 13:22:15 crc kubenswrapper[4764]: ${ho_enable} \ Mar 09 13:22:15 crc kubenswrapper[4764]: --enable-interconnect \ Mar 09 13:22:15 crc kubenswrapper[4764]: --disable-approver \ Mar 09 13:22:15 crc kubenswrapper[4764]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 09 13:22:15 crc kubenswrapper[4764]: --wait-for-kubernetes-api=200s \ Mar 09 13:22:15 crc kubenswrapper[4764]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 09 13:22:15 crc kubenswrapper[4764]: --loglevel="${LOGLEVEL}" Mar 09 13:22:15 crc kubenswrapper[4764]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 09 13:22:15 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.873889 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.874628 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 13:22:15 crc kubenswrapper[4764]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 09 13:22:15 crc kubenswrapper[4764]: if [[ -f "/env/_master" ]]; then Mar 09 13:22:15 crc kubenswrapper[4764]: set -o allexport Mar 09 13:22:15 crc kubenswrapper[4764]: source "/env/_master" Mar 09 13:22:15 crc kubenswrapper[4764]: set +o allexport Mar 09 13:22:15 crc kubenswrapper[4764]: fi Mar 09 13:22:15 crc kubenswrapper[4764]: Mar 09 13:22:15 crc kubenswrapper[4764]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 09 13:22:15 crc kubenswrapper[4764]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 09 13:22:15 crc kubenswrapper[4764]: --disable-webhook \ Mar 09 13:22:15 crc kubenswrapper[4764]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 09 13:22:15 crc kubenswrapper[4764]: --loglevel="${LOGLEVEL}" Mar 09 13:22:15 crc kubenswrapper[4764]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 09 13:22:15 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.876557 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.879313 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.888572 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.896492 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.896526 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.896536 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.896551 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.896562 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:15Z","lastTransitionTime":"2026-03-09T13:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.899680 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.909766 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.921587 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.932364 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.943375 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.952235 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.961365 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.969743 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.977675 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.988721 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.999041 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.999065 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.999073 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.999103 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.999112 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:15Z","lastTransitionTime":"2026-03-09T13:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.101310 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.101347 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.101357 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.101372 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.101383 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:16Z","lastTransitionTime":"2026-03-09T13:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.203714 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.203771 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.203783 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.203795 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.203804 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:16Z","lastTransitionTime":"2026-03-09T13:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.228323 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.228412 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-09 13:22:18.228395196 +0000 UTC m=+93.478567104 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.228465 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.228494 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.228515 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.228584 4764 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.228605 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.228624 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:18.228614352 +0000 UTC m=+93.478786250 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.228663 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:18.228633342 +0000 UTC m=+93.478805250 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.228919 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.228977 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.229010 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.229028 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.229031 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.229043 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.229045 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.229082 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:18.229070554 +0000 UTC m=+93.479242462 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.229100 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:18.229091514 +0000 UTC m=+93.479263422 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.306091 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.306164 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.306178 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.306195 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.306241 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:16Z","lastTransitionTime":"2026-03-09T13:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.408664 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.408715 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.408731 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.408749 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.408765 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:16Z","lastTransitionTime":"2026-03-09T13:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.510431 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.510454 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.510463 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.510476 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.510484 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:16Z","lastTransitionTime":"2026-03-09T13:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.559019 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.559119 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.559164 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.559290 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.559557 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.559617 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.575360 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.576791 4764 scope.go:117] "RemoveContainer" containerID="033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356" Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.577141 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.612398 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.612438 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.612449 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.612464 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.612475 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:16Z","lastTransitionTime":"2026-03-09T13:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.714807 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.714858 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.714868 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.714882 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.714891 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:16Z","lastTransitionTime":"2026-03-09T13:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.817567 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.817608 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.817617 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.817635 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.817663 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:16Z","lastTransitionTime":"2026-03-09T13:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.874351 4764 scope.go:117] "RemoveContainer" containerID="033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356" Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.874513 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.919158 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.919187 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.919195 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.919208 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.919217 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:16Z","lastTransitionTime":"2026-03-09T13:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.021795 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.021835 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.021847 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.021863 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.021874 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:17Z","lastTransitionTime":"2026-03-09T13:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.124575 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.124615 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.124623 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.124637 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.124662 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:17Z","lastTransitionTime":"2026-03-09T13:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.226407 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.226444 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.226455 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.226472 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.226485 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:17Z","lastTransitionTime":"2026-03-09T13:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.329182 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.329268 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.329293 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.329323 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.329342 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:17Z","lastTransitionTime":"2026-03-09T13:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.431218 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.431285 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.431303 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.431328 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.431347 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:17Z","lastTransitionTime":"2026-03-09T13:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.534099 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.534155 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.534166 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.534183 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.534195 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:17Z","lastTransitionTime":"2026-03-09T13:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.636389 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.636427 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.636438 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.636452 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.636462 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:17Z","lastTransitionTime":"2026-03-09T13:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.738388 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.738505 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.738522 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.738538 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.738546 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:17Z","lastTransitionTime":"2026-03-09T13:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.841568 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.841685 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.841723 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.841752 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.841771 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:17Z","lastTransitionTime":"2026-03-09T13:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.944008 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.944085 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.944107 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.944137 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.944158 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:17Z","lastTransitionTime":"2026-03-09T13:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.046179 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.046221 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.046233 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.046249 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.046261 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:18Z","lastTransitionTime":"2026-03-09T13:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.148761 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.148793 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.148804 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.148817 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.148828 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:18Z","lastTransitionTime":"2026-03-09T13:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.246387 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.246475 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.246508 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:18 crc kubenswrapper[4764]: E0309 13:22:18.246531 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:22:22.246512969 +0000 UTC m=+97.496684877 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.246555 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:18 crc kubenswrapper[4764]: E0309 13:22:18.246578 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:22:18 crc kubenswrapper[4764]: E0309 13:22:18.246660 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:22.246606961 +0000 UTC m=+97.496778869 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:22:18 crc kubenswrapper[4764]: E0309 13:22:18.246679 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:22:18 crc kubenswrapper[4764]: E0309 13:22:18.246696 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:22:18 crc kubenswrapper[4764]: E0309 13:22:18.246708 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:18 crc kubenswrapper[4764]: E0309 13:22:18.246741 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:22.246727575 +0000 UTC m=+97.496899483 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.246577 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:18 crc kubenswrapper[4764]: E0309 13:22:18.246812 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:22:18 crc kubenswrapper[4764]: E0309 13:22:18.246824 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:22:18 crc kubenswrapper[4764]: E0309 13:22:18.246833 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:18 crc kubenswrapper[4764]: E0309 13:22:18.246853 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:22:18 crc kubenswrapper[4764]: E0309 
13:22:18.246856 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:22.246849488 +0000 UTC m=+97.497021396 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:18 crc kubenswrapper[4764]: E0309 13:22:18.247013 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:22.246959191 +0000 UTC m=+97.497131129 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.251202 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.251271 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.251295 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.251326 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.251347 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:18Z","lastTransitionTime":"2026-03-09T13:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.353742 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.353833 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.353844 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.353861 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.353876 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:18Z","lastTransitionTime":"2026-03-09T13:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.455747 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.456001 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.456068 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.456143 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.456240 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:18Z","lastTransitionTime":"2026-03-09T13:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.558590 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.558608 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.558590 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:18 crc kubenswrapper[4764]: E0309 13:22:18.558708 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:18 crc kubenswrapper[4764]: E0309 13:22:18.558763 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.558797 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:18 crc kubenswrapper[4764]: E0309 13:22:18.558813 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.558820 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.558833 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.558846 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.558855 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:18Z","lastTransitionTime":"2026-03-09T13:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.661234 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.661563 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.661638 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.661734 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.661803 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:18Z","lastTransitionTime":"2026-03-09T13:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.763737 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.763779 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.763791 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.763809 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.763822 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:18Z","lastTransitionTime":"2026-03-09T13:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.868752 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.868816 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.868832 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.868857 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.868873 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:18Z","lastTransitionTime":"2026-03-09T13:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.971482 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.971521 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.971532 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.971547 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.971556 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:18Z","lastTransitionTime":"2026-03-09T13:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.073363 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.073392 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.073400 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.073414 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.073423 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:19Z","lastTransitionTime":"2026-03-09T13:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.175793 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.175838 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.175847 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.175860 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.175870 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:19Z","lastTransitionTime":"2026-03-09T13:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.277701 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.277743 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.277758 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.277773 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.277785 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:19Z","lastTransitionTime":"2026-03-09T13:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.379731 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.379768 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.379780 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.379794 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.379803 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:19Z","lastTransitionTime":"2026-03-09T13:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.482512 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.482772 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.482875 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.482954 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.483017 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:19Z","lastTransitionTime":"2026-03-09T13:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.585177 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.585436 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.585531 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.585606 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.585685 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:19Z","lastTransitionTime":"2026-03-09T13:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.688949 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.689003 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.689018 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.689038 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.689056 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:19Z","lastTransitionTime":"2026-03-09T13:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.792307 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.792380 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.792396 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.792421 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.792442 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:19Z","lastTransitionTime":"2026-03-09T13:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.894962 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.895015 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.895033 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.895059 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.895076 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:19Z","lastTransitionTime":"2026-03-09T13:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.998339 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.998680 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.998863 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.999004 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.999150 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:19Z","lastTransitionTime":"2026-03-09T13:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.101668 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.101696 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.101704 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.101716 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.101724 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:20Z","lastTransitionTime":"2026-03-09T13:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.205095 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.205183 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.205208 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.205242 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.205264 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:20Z","lastTransitionTime":"2026-03-09T13:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.307828 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.307868 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.307882 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.307898 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.307908 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:20Z","lastTransitionTime":"2026-03-09T13:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.409947 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.409997 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.410007 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.410020 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.410029 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:20Z","lastTransitionTime":"2026-03-09T13:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.512235 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.512287 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.512303 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.512326 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.512341 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:20Z","lastTransitionTime":"2026-03-09T13:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.558912 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.558918 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.558937 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:20 crc kubenswrapper[4764]: E0309 13:22:20.559056 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:20 crc kubenswrapper[4764]: E0309 13:22:20.559340 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:20 crc kubenswrapper[4764]: E0309 13:22:20.559439 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.614369 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.614406 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.614420 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.614436 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.614447 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:20Z","lastTransitionTime":"2026-03-09T13:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.716735 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.716770 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.716780 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.716796 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.716807 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:20Z","lastTransitionTime":"2026-03-09T13:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.820584 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.820624 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.820633 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.820663 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.820677 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:20Z","lastTransitionTime":"2026-03-09T13:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.922911 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.922964 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.922972 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.922988 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.922999 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:20Z","lastTransitionTime":"2026-03-09T13:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.025220 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.025258 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.025273 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.025288 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.025300 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:21Z","lastTransitionTime":"2026-03-09T13:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.128087 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.128132 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.128143 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.128158 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.128170 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:21Z","lastTransitionTime":"2026-03-09T13:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.230990 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.231048 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.231061 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.231079 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.231095 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:21Z","lastTransitionTime":"2026-03-09T13:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.334403 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.334447 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.334459 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.334472 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.334482 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:21Z","lastTransitionTime":"2026-03-09T13:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.436862 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.436901 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.436909 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.436923 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.436931 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:21Z","lastTransitionTime":"2026-03-09T13:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.539284 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.539331 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.539342 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.539357 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.539377 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:21Z","lastTransitionTime":"2026-03-09T13:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.641798 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.641840 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.641850 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.641865 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.641875 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:21Z","lastTransitionTime":"2026-03-09T13:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.744435 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.744473 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.744482 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.744495 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.744503 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:21Z","lastTransitionTime":"2026-03-09T13:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.847782 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.847829 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.847840 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.847858 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.847870 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:21Z","lastTransitionTime":"2026-03-09T13:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.950099 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.950158 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.950172 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.950189 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.950202 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:21Z","lastTransitionTime":"2026-03-09T13:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.052382 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.052444 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.052455 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.052472 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.052482 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:22Z","lastTransitionTime":"2026-03-09T13:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.155209 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.155243 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.155250 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.155265 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.155302 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:22Z","lastTransitionTime":"2026-03-09T13:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.257495 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.257586 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.257608 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.257681 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.257705 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:22Z","lastTransitionTime":"2026-03-09T13:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.283059 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.283220 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.283277 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.283334 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:22:30.283299938 +0000 UTC m=+105.533471856 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.283401 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.283411 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.283466 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.283531 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:30.283495653 +0000 UTC m=+105.533667691 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.283628 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.283701 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.283701 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.283723 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.283738 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.283746 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.283810 4764 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:30.283786651 +0000 UTC m=+105.533958569 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.283739 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.283900 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:30.283888754 +0000 UTC m=+105.534060672 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.283922 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:30.283912214 +0000 UTC m=+105.534084132 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.360663 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.360725 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.360745 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.360771 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.360786 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:22Z","lastTransitionTime":"2026-03-09T13:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.421610 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.421681 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.421690 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.421706 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.421716 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:22Z","lastTransitionTime":"2026-03-09T13:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.432892 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.437288 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.437357 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.437372 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.437394 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.437408 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:22Z","lastTransitionTime":"2026-03-09T13:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.450726 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.456466 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.456519 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.456532 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.456551 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.456566 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:22Z","lastTransitionTime":"2026-03-09T13:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.467952 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.473487 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.473548 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.473559 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.473584 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.473595 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:22Z","lastTransitionTime":"2026-03-09T13:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.490518 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.495601 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.495704 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.495718 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.495742 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.495757 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:22Z","lastTransitionTime":"2026-03-09T13:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.510214 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.510468 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.512623 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.512702 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.512713 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.512732 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.512746 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:22Z","lastTransitionTime":"2026-03-09T13:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.558988 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.559136 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.559251 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.559177 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.559374 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.559482 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.615766 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.615819 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.615830 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.615849 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.615863 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:22Z","lastTransitionTime":"2026-03-09T13:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.718291 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.718354 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.718363 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.718388 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.718401 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:22Z","lastTransitionTime":"2026-03-09T13:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.820515 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.820547 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.820555 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.820571 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.820580 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:22Z","lastTransitionTime":"2026-03-09T13:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.924536 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.924591 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.924603 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.924621 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.924655 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:22Z","lastTransitionTime":"2026-03-09T13:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.027250 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.027513 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.027600 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.027794 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.027931 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:23Z","lastTransitionTime":"2026-03-09T13:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.130783 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.130822 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.130834 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.130849 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.130859 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:23Z","lastTransitionTime":"2026-03-09T13:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.233402 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.233456 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.233470 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.233487 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.233498 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:23Z","lastTransitionTime":"2026-03-09T13:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.336421 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.336476 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.336501 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.336526 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.336541 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:23Z","lastTransitionTime":"2026-03-09T13:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.439572 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.439612 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.439622 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.439637 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.439664 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:23Z","lastTransitionTime":"2026-03-09T13:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.541221 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.541255 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.541264 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.541277 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.541285 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:23Z","lastTransitionTime":"2026-03-09T13:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.643199 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.643231 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.643265 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.643287 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.643297 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:23Z","lastTransitionTime":"2026-03-09T13:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.745266 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.745314 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.745328 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.745344 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.745356 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:23Z","lastTransitionTime":"2026-03-09T13:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.847088 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.847118 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.847126 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.847139 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.847147 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:23Z","lastTransitionTime":"2026-03-09T13:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.949438 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.949484 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.949497 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.949515 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.949529 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:23Z","lastTransitionTime":"2026-03-09T13:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.051404 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.051442 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.051457 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.051471 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.051480 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:24Z","lastTransitionTime":"2026-03-09T13:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.154032 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.154261 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.154268 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.154284 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.154302 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:24Z","lastTransitionTime":"2026-03-09T13:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.256233 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.256276 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.256286 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.256303 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.256314 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:24Z","lastTransitionTime":"2026-03-09T13:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.358408 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.358456 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.358465 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.358477 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.358487 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:24Z","lastTransitionTime":"2026-03-09T13:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.461100 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.461164 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.461176 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.461193 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.461204 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:24Z","lastTransitionTime":"2026-03-09T13:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.559551 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.559598 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.559626 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:24 crc kubenswrapper[4764]: E0309 13:22:24.559765 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:24 crc kubenswrapper[4764]: E0309 13:22:24.559863 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:24 crc kubenswrapper[4764]: E0309 13:22:24.560020 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.563676 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.563708 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.563718 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.563733 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.563743 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:24Z","lastTransitionTime":"2026-03-09T13:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.665944 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.665992 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.666003 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.666020 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.666030 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:24Z","lastTransitionTime":"2026-03-09T13:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.768282 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.768322 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.768337 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.768353 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.768363 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:24Z","lastTransitionTime":"2026-03-09T13:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.870235 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.870285 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.870299 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.870316 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.870328 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:24Z","lastTransitionTime":"2026-03-09T13:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.972841 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.972896 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.972907 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.972926 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.972937 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:24Z","lastTransitionTime":"2026-03-09T13:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.075683 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.075764 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.075777 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.075817 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.075829 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:25Z","lastTransitionTime":"2026-03-09T13:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.178682 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.178732 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.178743 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.178760 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.178773 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:25Z","lastTransitionTime":"2026-03-09T13:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.280618 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.280694 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.280709 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.280726 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.280736 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:25Z","lastTransitionTime":"2026-03-09T13:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.383030 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.383066 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.383073 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.383088 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.383099 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:25Z","lastTransitionTime":"2026-03-09T13:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.484931 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.484972 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.484986 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.485003 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.485015 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:25Z","lastTransitionTime":"2026-03-09T13:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.569590 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a
9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.578912 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.587477 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.587573 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:25 crc 
kubenswrapper[4764]: I0309 13:22:25.587616 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.587629 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.587670 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.587680 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:25Z","lastTransitionTime":"2026-03-09T13:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.598375 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.609590 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.619981 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.636148 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.689394 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.689426 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.689435 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.689447 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.689456 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:25Z","lastTransitionTime":"2026-03-09T13:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.791584 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.791633 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.791666 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.791688 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.791703 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:25Z","lastTransitionTime":"2026-03-09T13:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.893629 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.893692 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.893703 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.893718 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.893727 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:25Z","lastTransitionTime":"2026-03-09T13:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.996287 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.996325 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.996334 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.996349 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.996360 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:25Z","lastTransitionTime":"2026-03-09T13:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.098962 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.099013 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.099025 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.099044 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.099056 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:26Z","lastTransitionTime":"2026-03-09T13:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.201514 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.201558 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.201569 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.201584 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.201595 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:26Z","lastTransitionTime":"2026-03-09T13:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.303861 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.303895 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.303904 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.303918 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.303929 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:26Z","lastTransitionTime":"2026-03-09T13:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.406211 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.406252 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.406261 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.406305 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.406315 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:26Z","lastTransitionTime":"2026-03-09T13:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.508121 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.508164 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.508178 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.508198 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.508210 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:26Z","lastTransitionTime":"2026-03-09T13:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.559611 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.559702 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.559622 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:26 crc kubenswrapper[4764]: E0309 13:22:26.559754 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:26 crc kubenswrapper[4764]: E0309 13:22:26.559835 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:26 crc kubenswrapper[4764]: E0309 13:22:26.560000 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.610370 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.610615 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.610631 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.610663 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.610674 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:26Z","lastTransitionTime":"2026-03-09T13:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.713243 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.713290 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.713300 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.713314 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.713324 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:26Z","lastTransitionTime":"2026-03-09T13:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.815776 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.815813 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.815820 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.815833 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.815842 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:26Z","lastTransitionTime":"2026-03-09T13:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.918137 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.918193 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.918202 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.918216 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.918225 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:26Z","lastTransitionTime":"2026-03-09T13:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.020702 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.020740 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.020751 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.020766 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.020777 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:27Z","lastTransitionTime":"2026-03-09T13:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.123149 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.123191 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.123208 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.123224 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.123234 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:27Z","lastTransitionTime":"2026-03-09T13:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.225832 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.225865 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.225875 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.225889 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.225900 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:27Z","lastTransitionTime":"2026-03-09T13:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.328130 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.328175 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.328191 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.328210 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.328224 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:27Z","lastTransitionTime":"2026-03-09T13:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.429999 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.430035 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.430045 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.430059 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.430069 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:27Z","lastTransitionTime":"2026-03-09T13:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.532519 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.532599 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.532613 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.532630 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.532690 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:27Z","lastTransitionTime":"2026-03-09T13:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.635515 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.635569 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.635582 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.635607 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.635622 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:27Z","lastTransitionTime":"2026-03-09T13:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.656619 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-r5bnx"] Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.656998 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-r5bnx" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.659632 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.659761 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.660030 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.665496 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.674871 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.686555 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.696346 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.707608 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.713450 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.720586 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.727263 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.732469 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7fede188-66d9-4cb1-af19-c94afe7fbcde-hosts-file\") pod \"node-resolver-r5bnx\" (UID: \"7fede188-66d9-4cb1-af19-c94afe7fbcde\") " pod="openshift-dns/node-resolver-r5bnx" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.732538 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjp5c\" (UniqueName: 
\"kubernetes.io/projected/7fede188-66d9-4cb1-af19-c94afe7fbcde-kube-api-access-pjp5c\") pod \"node-resolver-r5bnx\" (UID: \"7fede188-66d9-4cb1-af19-c94afe7fbcde\") " pod="openshift-dns/node-resolver-r5bnx" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.737830 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.737873 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.737883 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.737897 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.737907 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:27Z","lastTransitionTime":"2026-03-09T13:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.833722 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjp5c\" (UniqueName: \"kubernetes.io/projected/7fede188-66d9-4cb1-af19-c94afe7fbcde-kube-api-access-pjp5c\") pod \"node-resolver-r5bnx\" (UID: \"7fede188-66d9-4cb1-af19-c94afe7fbcde\") " pod="openshift-dns/node-resolver-r5bnx" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.833801 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7fede188-66d9-4cb1-af19-c94afe7fbcde-hosts-file\") pod \"node-resolver-r5bnx\" (UID: \"7fede188-66d9-4cb1-af19-c94afe7fbcde\") " pod="openshift-dns/node-resolver-r5bnx" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.833911 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7fede188-66d9-4cb1-af19-c94afe7fbcde-hosts-file\") pod \"node-resolver-r5bnx\" (UID: \"7fede188-66d9-4cb1-af19-c94afe7fbcde\") " pod="openshift-dns/node-resolver-r5bnx" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.840942 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.840983 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.840992 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.841006 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.841015 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:27Z","lastTransitionTime":"2026-03-09T13:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.851097 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjp5c\" (UniqueName: \"kubernetes.io/projected/7fede188-66d9-4cb1-af19-c94afe7fbcde-kube-api-access-pjp5c\") pod \"node-resolver-r5bnx\" (UID: \"7fede188-66d9-4cb1-af19-c94afe7fbcde\") " pod="openshift-dns/node-resolver-r5bnx" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.943019 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.943108 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.943126 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.943151 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.943198 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:27Z","lastTransitionTime":"2026-03-09T13:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.969125 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-r5bnx" Mar 09 13:22:27 crc kubenswrapper[4764]: E0309 13:22:27.984877 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 13:22:27 crc kubenswrapper[4764]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 09 13:22:27 crc kubenswrapper[4764]: set -uo pipefail Mar 09 13:22:27 crc kubenswrapper[4764]: Mar 09 13:22:27 crc kubenswrapper[4764]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 09 13:22:27 crc kubenswrapper[4764]: Mar 09 13:22:27 crc kubenswrapper[4764]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 09 13:22:27 crc kubenswrapper[4764]: HOSTS_FILE="/etc/hosts" Mar 09 13:22:27 crc kubenswrapper[4764]: TEMP_FILE="/etc/hosts.tmp" Mar 09 13:22:27 crc kubenswrapper[4764]: Mar 09 13:22:27 crc kubenswrapper[4764]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 09 13:22:27 crc kubenswrapper[4764]: Mar 09 13:22:27 crc kubenswrapper[4764]: # Make a temporary file with the old hosts file's attributes. Mar 09 13:22:27 crc kubenswrapper[4764]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 09 13:22:27 crc kubenswrapper[4764]: echo "Failed to preserve hosts file. Exiting." Mar 09 13:22:27 crc kubenswrapper[4764]: exit 1 Mar 09 13:22:27 crc kubenswrapper[4764]: fi Mar 09 13:22:27 crc kubenswrapper[4764]: Mar 09 13:22:27 crc kubenswrapper[4764]: while true; do Mar 09 13:22:27 crc kubenswrapper[4764]: declare -A svc_ips Mar 09 13:22:27 crc kubenswrapper[4764]: for svc in "${services[@]}"; do Mar 09 13:22:27 crc kubenswrapper[4764]: # Fetch service IP from cluster dns if present. 
We make several tries Mar 09 13:22:27 crc kubenswrapper[4764]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 09 13:22:27 crc kubenswrapper[4764]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 09 13:22:27 crc kubenswrapper[4764]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 09 13:22:27 crc kubenswrapper[4764]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 09 13:22:27 crc kubenswrapper[4764]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 09 13:22:27 crc kubenswrapper[4764]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 09 13:22:27 crc kubenswrapper[4764]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 09 13:22:27 crc kubenswrapper[4764]: for i in ${!cmds[*]} Mar 09 13:22:27 crc kubenswrapper[4764]: do Mar 09 13:22:27 crc kubenswrapper[4764]: ips=($(eval "${cmds[i]}")) Mar 09 13:22:27 crc kubenswrapper[4764]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 09 13:22:27 crc kubenswrapper[4764]: svc_ips["${svc}"]="${ips[@]}" Mar 09 13:22:27 crc kubenswrapper[4764]: break Mar 09 13:22:27 crc kubenswrapper[4764]: fi Mar 09 13:22:27 crc kubenswrapper[4764]: done Mar 09 13:22:27 crc kubenswrapper[4764]: done Mar 09 13:22:27 crc kubenswrapper[4764]: Mar 09 13:22:27 crc kubenswrapper[4764]: # Update /etc/hosts only if we get valid service IPs Mar 09 13:22:27 crc kubenswrapper[4764]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 09 13:22:27 crc kubenswrapper[4764]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 09 13:22:27 crc kubenswrapper[4764]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 09 13:22:27 crc kubenswrapper[4764]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 09 13:22:27 crc kubenswrapper[4764]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 09 13:22:27 crc kubenswrapper[4764]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 09 13:22:27 crc kubenswrapper[4764]: sleep 60 & wait Mar 09 13:22:27 crc kubenswrapper[4764]: continue Mar 09 13:22:27 crc kubenswrapper[4764]: fi Mar 09 13:22:27 crc kubenswrapper[4764]: Mar 09 13:22:27 crc kubenswrapper[4764]: # Append resolver entries for services Mar 09 13:22:27 crc kubenswrapper[4764]: rc=0 Mar 09 13:22:27 crc kubenswrapper[4764]: for svc in "${!svc_ips[@]}"; do Mar 09 13:22:27 crc kubenswrapper[4764]: for ip in ${svc_ips[${svc}]}; do Mar 09 13:22:27 crc kubenswrapper[4764]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 09 13:22:27 crc kubenswrapper[4764]: done Mar 09 13:22:27 crc kubenswrapper[4764]: done Mar 09 13:22:27 crc kubenswrapper[4764]: if [[ $rc -ne 0 ]]; then Mar 09 13:22:27 crc kubenswrapper[4764]: sleep 60 & wait Mar 09 13:22:27 crc kubenswrapper[4764]: continue Mar 09 13:22:27 crc kubenswrapper[4764]: fi Mar 09 13:22:27 crc kubenswrapper[4764]: Mar 09 13:22:27 crc kubenswrapper[4764]: Mar 09 13:22:27 crc kubenswrapper[4764]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 09 13:22:27 crc kubenswrapper[4764]: # Replace /etc/hosts with our modified version if needed Mar 09 13:22:27 crc kubenswrapper[4764]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 09 13:22:27 crc kubenswrapper[4764]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 09 13:22:27 crc kubenswrapper[4764]: fi Mar 09 13:22:27 crc kubenswrapper[4764]: sleep 60 & wait Mar 09 13:22:27 crc kubenswrapper[4764]: unset svc_ips Mar 09 13:22:27 crc kubenswrapper[4764]: done Mar 09 13:22:27 crc kubenswrapper[4764]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pjp5c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-r5bnx_openshift-dns(7fede188-66d9-4cb1-af19-c94afe7fbcde): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 09 13:22:27 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 09 13:22:27 crc kubenswrapper[4764]: E0309 13:22:27.985950 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-r5bnx" 
podUID="7fede188-66d9-4cb1-af19-c94afe7fbcde" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.023866 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-xxczl"] Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.024160 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-zmzm7"] Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.024356 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.024514 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.024851 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-crvdf"] Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.025716 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.030238 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.030573 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.030825 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.031094 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.031208 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.031372 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.031775 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.031864 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.031901 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.031910 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.033838 4764 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.035754 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.046124 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.046156 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.046165 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.046178 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.046206 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:28Z","lastTransitionTime":"2026-03-09T13:22:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.048172 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a
9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.088076 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.111441 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.119850 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.129199 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135132 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-cnibin\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " 
pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135163 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk5pb\" (UniqueName: \"kubernetes.io/projected/202a1f58-ce83-4374-ac48-dc806f7b9d6b-kube-api-access-rk5pb\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135182 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-cni-binary-copy\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135198 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135213 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-cnibin\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135228 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-run-netns\") pod \"multus-zmzm7\" (UID: 
\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135244 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-hostroot\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135259 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6bcdd179-43c2-427c-9fac-7155c122e922-mcd-auth-proxy-config\") pod \"machine-config-daemon-xxczl\" (UID: \"6bcdd179-43c2-427c-9fac-7155c122e922\") " pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135272 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25jms\" (UniqueName: \"kubernetes.io/projected/6bcdd179-43c2-427c-9fac-7155c122e922-kube-api-access-25jms\") pod \"machine-config-daemon-xxczl\" (UID: \"6bcdd179-43c2-427c-9fac-7155c122e922\") " pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135302 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6bcdd179-43c2-427c-9fac-7155c122e922-rootfs\") pod \"machine-config-daemon-xxczl\" (UID: \"6bcdd179-43c2-427c-9fac-7155c122e922\") " pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135317 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-var-lib-cni-multus\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135331 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/202a1f58-ce83-4374-ac48-dc806f7b9d6b-multus-daemon-config\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135347 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-etc-kubernetes\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135371 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-run-multus-certs\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135386 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-os-release\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135498 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-os-release\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135541 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/202a1f58-ce83-4374-ac48-dc806f7b9d6b-cni-binary-copy\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135666 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-run-k8s-cni-cncf-io\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135694 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-system-cni-dir\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135828 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6bcdd179-43c2-427c-9fac-7155c122e922-proxy-tls\") pod \"machine-config-daemon-xxczl\" (UID: \"6bcdd179-43c2-427c-9fac-7155c122e922\") " pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135853 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-multus-socket-dir-parent\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135914 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-system-cni-dir\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135939 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.136056 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-multus-conf-dir\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.136095 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjrkl\" (UniqueName: \"kubernetes.io/projected/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-kube-api-access-qjrkl\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.136138 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-multus-cni-dir\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.136158 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-var-lib-kubelet\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.136255 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-var-lib-cni-bin\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.136553 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.144993 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.148032 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.148056 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.148066 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.148079 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.148089 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:28Z","lastTransitionTime":"2026-03-09T13:22:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.152163 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.160946 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.169245 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.178412 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.184761 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.194157 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.206004 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.213844 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.222102 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.230561 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237122 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6bcdd179-43c2-427c-9fac-7155c122e922-mcd-auth-proxy-config\") pod \"machine-config-daemon-xxczl\" (UID: \"6bcdd179-43c2-427c-9fac-7155c122e922\") " pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237188 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25jms\" (UniqueName: 
\"kubernetes.io/projected/6bcdd179-43c2-427c-9fac-7155c122e922-kube-api-access-25jms\") pod \"machine-config-daemon-xxczl\" (UID: \"6bcdd179-43c2-427c-9fac-7155c122e922\") " pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237211 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-run-netns\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237250 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-hostroot\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237285 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-var-lib-cni-multus\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237307 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/202a1f58-ce83-4374-ac48-dc806f7b9d6b-multus-daemon-config\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237357 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6bcdd179-43c2-427c-9fac-7155c122e922-rootfs\") pod \"machine-config-daemon-xxczl\" (UID: 
\"6bcdd179-43c2-427c-9fac-7155c122e922\") " pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237379 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-run-multus-certs\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237384 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-var-lib-cni-multus\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237391 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-hostroot\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237421 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-etc-kubernetes\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237446 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/202a1f58-ce83-4374-ac48-dc806f7b9d6b-cni-binary-copy\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 
13:22:28.237462 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6bcdd179-43c2-427c-9fac-7155c122e922-rootfs\") pod \"machine-config-daemon-xxczl\" (UID: \"6bcdd179-43c2-427c-9fac-7155c122e922\") " pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237466 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-os-release\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237497 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-run-multus-certs\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237346 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-run-netns\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237513 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-os-release\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237557 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/6bcdd179-43c2-427c-9fac-7155c122e922-proxy-tls\") pod \"machine-config-daemon-xxczl\" (UID: \"6bcdd179-43c2-427c-9fac-7155c122e922\") " pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237576 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-multus-socket-dir-parent\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237593 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-run-k8s-cni-cncf-io\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237610 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-system-cni-dir\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237627 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-system-cni-dir\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237658 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237609 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-os-release\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237688 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-multus-conf-dir\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237713 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-etc-kubernetes\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237728 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjrkl\" (UniqueName: \"kubernetes.io/projected/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-kube-api-access-qjrkl\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237768 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-multus-cni-dir\") pod 
\"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237783 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6bcdd179-43c2-427c-9fac-7155c122e922-mcd-auth-proxy-config\") pod \"machine-config-daemon-xxczl\" (UID: \"6bcdd179-43c2-427c-9fac-7155c122e922\") " pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237801 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-run-k8s-cni-cncf-io\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237836 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-system-cni-dir\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237791 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-var-lib-kubelet\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237855 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-system-cni-dir\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " 
pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237842 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-var-lib-kubelet\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237906 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-multus-conf-dir\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237922 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-multus-socket-dir-parent\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237950 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-multus-cni-dir\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237948 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-os-release\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.238020 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-var-lib-cni-bin\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.238073 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-var-lib-cni-bin\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.238084 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk5pb\" (UniqueName: \"kubernetes.io/projected/202a1f58-ce83-4374-ac48-dc806f7b9d6b-kube-api-access-rk5pb\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.238110 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-cnibin\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.238170 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/202a1f58-ce83-4374-ac48-dc806f7b9d6b-multus-daemon-config\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.238179 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.238186 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-cnibin\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.238204 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-cnibin\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.238249 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-cni-binary-copy\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.238249 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-cnibin\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.238393 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.238581 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/202a1f58-ce83-4374-ac48-dc806f7b9d6b-cni-binary-copy\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.238720 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.238889 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-cni-binary-copy\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.240096 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.240349 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6bcdd179-43c2-427c-9fac-7155c122e922-proxy-tls\") pod \"machine-config-daemon-xxczl\" (UID: \"6bcdd179-43c2-427c-9fac-7155c122e922\") " pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.249353 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.250554 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.250584 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.250593 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.250609 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.250620 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:28Z","lastTransitionTime":"2026-03-09T13:22:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.253816 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjrkl\" (UniqueName: \"kubernetes.io/projected/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-kube-api-access-qjrkl\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.254025 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25jms\" (UniqueName: \"kubernetes.io/projected/6bcdd179-43c2-427c-9fac-7155c122e922-kube-api-access-25jms\") pod \"machine-config-daemon-xxczl\" (UID: \"6bcdd179-43c2-427c-9fac-7155c122e922\") " pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.258539 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk5pb\" (UniqueName: \"kubernetes.io/projected/202a1f58-ce83-4374-ac48-dc806f7b9d6b-kube-api-access-rk5pb\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.262625 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.271765 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.345961 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.352038 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.352082 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.352095 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.352112 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.352123 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:28Z","lastTransitionTime":"2026-03-09T13:22:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:28 crc kubenswrapper[4764]: W0309 13:22:28.355322 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod202a1f58_ce83_4374_ac48_dc806f7b9d6b.slice/crio-a445c4d4ff8c0e7680f3c86119feb92a65fc69ae77f288e56f271709f496c0ac WatchSource:0}: Error finding container a445c4d4ff8c0e7680f3c86119feb92a65fc69ae77f288e56f271709f496c0ac: Status 404 returned error can't find the container with id a445c4d4ff8c0e7680f3c86119feb92a65fc69ae77f288e56f271709f496c0ac Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.357397 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 13:22:28 crc kubenswrapper[4764]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 09 13:22:28 crc kubenswrapper[4764]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 09 13:22:28 crc kubenswrapper[4764]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rk5pb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-zmzm7_openshift-multus(202a1f58-ce83-4374-ac48-dc806f7b9d6b): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 09 13:22:28 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.358600 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-zmzm7" podUID="202a1f58-ce83-4374-ac48-dc806f7b9d6b" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.369254 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.375233 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: W0309 13:22:28.381460 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bcdd179_43c2_427c_9fac_7155c122e922.slice/crio-53e57fc2015ccb3794b104e1fb3b1e815a6b724ebb5c7cdb477b30d5440dd8ad WatchSource:0}: Error finding container 53e57fc2015ccb3794b104e1fb3b1e815a6b724ebb5c7cdb477b30d5440dd8ad: Status 404 returned error can't find the container with id 53e57fc2015ccb3794b104e1fb3b1e815a6b724ebb5c7cdb477b30d5440dd8ad Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.383252 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-25jms,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.388304 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt 
--tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-25jms,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 09 13:22:28 crc kubenswrapper[4764]: W0309 13:22:28.389470 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod072442e6_8ece_4f72_a8cb_ad7ef1e3facb.slice/crio-40f63e24da2ee3c429a77ccc1ab25e68446cce3675cd0f629706236630ebbef0 WatchSource:0}: Error finding container 40f63e24da2ee3c429a77ccc1ab25e68446cce3675cd0f629706236630ebbef0: Status 404 returned error can't find the container with id 40f63e24da2ee3c429a77ccc1ab25e68446cce3675cd0f629706236630ebbef0 
Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.389522 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.391565 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qjrkl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessPro
be:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-crvdf_openshift-multus(072442e6-8ece-4f72-a8cb-ad7ef1e3facb): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.394841 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-crvdf" podUID="072442e6-8ece-4f72-a8cb-ad7ef1e3facb" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.404167 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7kggv"] Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.405083 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.408166 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.408184 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.408936 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.409080 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.409130 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.410365 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.410529 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.417347 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.427017 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.435818 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.444846 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.453502 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.455354 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.455385 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.455402 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.455436 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.455450 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:28Z","lastTransitionTime":"2026-03-09T13:22:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.462298 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.471125 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.479972 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.486084 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.494139 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.504394 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.517589 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.540814 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-ovn\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.540847 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.540868 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-env-overrides\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.540882 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-slash\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.540896 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-systemd\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.540910 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-log-socket\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.540923 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-run-ovn-kubernetes\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.540938 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-cni-bin\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.541043 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-run-netns\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.541114 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-openvswitch\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.541268 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovn-node-metrics-cert\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.541381 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-etc-openvswitch\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.541416 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovnkube-script-lib\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.541442 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-systemd-units\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.541476 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovnkube-config\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.541498 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5xrv\" (UniqueName: \"kubernetes.io/projected/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-kube-api-access-r5xrv\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.541549 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-var-lib-openvswitch\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.541706 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-node-log\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.541733 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-cni-netd\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.541763 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-kubelet\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.557445 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.557500 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.557510 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.557526 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.557539 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:28Z","lastTransitionTime":"2026-03-09T13:22:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.558692 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.558704 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.558864 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.559021 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.559277 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.559352 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.560194 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 13:22:28 crc kubenswrapper[4764]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 09 13:22:28 crc kubenswrapper[4764]: if [[ -f "/env/_master" ]]; then Mar 09 13:22:28 crc kubenswrapper[4764]: set -o allexport Mar 09 13:22:28 crc kubenswrapper[4764]: source "/env/_master" Mar 09 13:22:28 crc kubenswrapper[4764]: set +o allexport Mar 09 13:22:28 crc kubenswrapper[4764]: fi Mar 09 13:22:28 crc kubenswrapper[4764]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 09 13:22:28 crc kubenswrapper[4764]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 09 13:22:28 crc kubenswrapper[4764]: ho_enable="--enable-hybrid-overlay" Mar 09 13:22:28 crc kubenswrapper[4764]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 09 13:22:28 crc kubenswrapper[4764]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 09 13:22:28 crc kubenswrapper[4764]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 09 13:22:28 crc kubenswrapper[4764]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 09 13:22:28 crc kubenswrapper[4764]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 09 13:22:28 crc kubenswrapper[4764]: --webhook-host=127.0.0.1 \ Mar 09 13:22:28 crc kubenswrapper[4764]: --webhook-port=9743 \ Mar 09 13:22:28 crc kubenswrapper[4764]: ${ho_enable} \ Mar 09 13:22:28 crc kubenswrapper[4764]: --enable-interconnect \ Mar 09 13:22:28 crc kubenswrapper[4764]: 
--disable-approver \ Mar 09 13:22:28 crc kubenswrapper[4764]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 09 13:22:28 crc kubenswrapper[4764]: --wait-for-kubernetes-api=200s \ Mar 09 13:22:28 crc kubenswrapper[4764]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 09 13:22:28 crc kubenswrapper[4764]: --loglevel="${LOGLEVEL}" Mar 09 13:22:28 crc kubenswrapper[4764]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:n
il,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 09 13:22:28 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.562313 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 13:22:28 crc kubenswrapper[4764]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 09 13:22:28 crc kubenswrapper[4764]: if [[ -f "/env/_master" ]]; then Mar 09 13:22:28 crc kubenswrapper[4764]: set -o allexport Mar 09 13:22:28 crc kubenswrapper[4764]: source "/env/_master" Mar 09 13:22:28 crc kubenswrapper[4764]: set +o allexport Mar 09 13:22:28 crc kubenswrapper[4764]: fi Mar 09 13:22:28 crc kubenswrapper[4764]: Mar 09 13:22:28 crc kubenswrapper[4764]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 09 13:22:28 crc kubenswrapper[4764]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 09 13:22:28 crc kubenswrapper[4764]: --disable-webhook \ Mar 09 13:22:28 crc kubenswrapper[4764]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 09 13:22:28 crc kubenswrapper[4764]: --loglevel="${LOGLEVEL}" Mar 09 13:22:28 crc kubenswrapper[4764]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 09 13:22:28 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.563449 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.642712 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-kubelet\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.642750 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-ovn\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.642771 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.642786 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-env-overrides\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.642801 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-slash\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 
13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.642815 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-systemd\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.642829 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-log-socket\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.642843 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-run-ovn-kubernetes\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.642858 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-cni-bin\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.642887 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-run-netns\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.642902 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-openvswitch\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.642924 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovn-node-metrics-cert\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.642973 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-etc-openvswitch\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.642988 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovnkube-script-lib\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643001 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-systemd-units\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643025 4764 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovnkube-config\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643041 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5xrv\" (UniqueName: \"kubernetes.io/projected/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-kube-api-access-r5xrv\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643084 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-var-lib-openvswitch\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643099 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-node-log\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643114 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-cni-netd\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643171 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-cni-netd\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643205 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-kubelet\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643226 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-ovn\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643245 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643823 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-etc-openvswitch\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643868 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-cni-bin\") pod 
\"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643884 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-run-netns\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643915 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-openvswitch\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643915 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-slash\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643948 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-env-overrides\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643958 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-systemd\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 
13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643978 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-log-socket\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.644007 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-run-ovn-kubernetes\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.644291 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-systemd-units\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.644599 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-var-lib-openvswitch\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.644634 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-node-log\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.644858 4764 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovnkube-config\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.645199 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovnkube-script-lib\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.646999 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovn-node-metrics-cert\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.659589 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.659656 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.659675 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.659692 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.659701 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:28Z","lastTransitionTime":"2026-03-09T13:22:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.660833 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5xrv\" (UniqueName: \"kubernetes.io/projected/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-kube-api-access-r5xrv\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.721838 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: W0309 13:22:28.733189 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8ccb4f5_550a_41b2_b39d_201cdd5d902a.slice/crio-65054061a375abf79424d7095dced33c492d7e93ee2f22f668abd5a0c5ced956 WatchSource:0}: Error finding container 65054061a375abf79424d7095dced33c492d7e93ee2f22f668abd5a0c5ced956: Status 404 returned error can't find the container with id 65054061a375abf79424d7095dced33c492d7e93ee2f22f668abd5a0c5ced956 Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.735304 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 13:22:28 crc kubenswrapper[4764]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 09 13:22:28 crc kubenswrapper[4764]: apiVersion: v1 Mar 09 13:22:28 crc kubenswrapper[4764]: clusters: 
Mar 09 13:22:28 crc kubenswrapper[4764]: - cluster: Mar 09 13:22:28 crc kubenswrapper[4764]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 09 13:22:28 crc kubenswrapper[4764]: server: https://api-int.crc.testing:6443 Mar 09 13:22:28 crc kubenswrapper[4764]: name: default-cluster Mar 09 13:22:28 crc kubenswrapper[4764]: contexts: Mar 09 13:22:28 crc kubenswrapper[4764]: - context: Mar 09 13:22:28 crc kubenswrapper[4764]: cluster: default-cluster Mar 09 13:22:28 crc kubenswrapper[4764]: namespace: default Mar 09 13:22:28 crc kubenswrapper[4764]: user: default-auth Mar 09 13:22:28 crc kubenswrapper[4764]: name: default-context Mar 09 13:22:28 crc kubenswrapper[4764]: current-context: default-context Mar 09 13:22:28 crc kubenswrapper[4764]: kind: Config Mar 09 13:22:28 crc kubenswrapper[4764]: preferences: {} Mar 09 13:22:28 crc kubenswrapper[4764]: users: Mar 09 13:22:28 crc kubenswrapper[4764]: - name: default-auth Mar 09 13:22:28 crc kubenswrapper[4764]: user: Mar 09 13:22:28 crc kubenswrapper[4764]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 09 13:22:28 crc kubenswrapper[4764]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 09 13:22:28 crc kubenswrapper[4764]: EOF Mar 09 13:22:28 crc kubenswrapper[4764]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r5xrv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 09 13:22:28 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.736472 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.740359 4764 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.761905 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.761944 4764 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.761958 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.761975 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.761986 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:28Z","lastTransitionTime":"2026-03-09T13:22:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.864709 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.864771 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.864786 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.864804 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.864817 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:28Z","lastTransitionTime":"2026-03-09T13:22:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.901185 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"53e57fc2015ccb3794b104e1fb3b1e815a6b724ebb5c7cdb477b30d5440dd8ad"} Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.902153 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-r5bnx" event={"ID":"7fede188-66d9-4cb1-af19-c94afe7fbcde","Type":"ContainerStarted","Data":"78819a166f081d16d2602205a32b059426bbf5489e3535ca772ed8601874dbea"} Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.902925 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-25jms,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.903469 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" event={"ID":"072442e6-8ece-4f72-a8cb-ad7ef1e3facb","Type":"ContainerStarted","Data":"40f63e24da2ee3c429a77ccc1ab25e68446cce3675cd0f629706236630ebbef0"} Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.904730 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qjrkl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-crvdf_openshift-multus(072442e6-8ece-4f72-a8cb-ad7ef1e3facb): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.905177 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-25jms,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.905822 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-crvdf" podUID="072442e6-8ece-4f72-a8cb-ad7ef1e3facb" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.906256 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.906430 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zmzm7" event={"ID":"202a1f58-ce83-4374-ac48-dc806f7b9d6b","Type":"ContainerStarted","Data":"a445c4d4ff8c0e7680f3c86119feb92a65fc69ae77f288e56f271709f496c0ac"} Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.906836 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 13:22:28 crc kubenswrapper[4764]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 09 13:22:28 crc kubenswrapper[4764]: set -uo pipefail Mar 09 13:22:28 crc kubenswrapper[4764]: Mar 09 13:22:28 crc 
kubenswrapper[4764]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 09 13:22:28 crc kubenswrapper[4764]: Mar 09 13:22:28 crc kubenswrapper[4764]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 09 13:22:28 crc kubenswrapper[4764]: HOSTS_FILE="/etc/hosts" Mar 09 13:22:28 crc kubenswrapper[4764]: TEMP_FILE="/etc/hosts.tmp" Mar 09 13:22:28 crc kubenswrapper[4764]: Mar 09 13:22:28 crc kubenswrapper[4764]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 09 13:22:28 crc kubenswrapper[4764]: Mar 09 13:22:28 crc kubenswrapper[4764]: # Make a temporary file with the old hosts file's attributes. Mar 09 13:22:28 crc kubenswrapper[4764]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 09 13:22:28 crc kubenswrapper[4764]: echo "Failed to preserve hosts file. Exiting." Mar 09 13:22:28 crc kubenswrapper[4764]: exit 1 Mar 09 13:22:28 crc kubenswrapper[4764]: fi Mar 09 13:22:28 crc kubenswrapper[4764]: Mar 09 13:22:28 crc kubenswrapper[4764]: while true; do Mar 09 13:22:28 crc kubenswrapper[4764]: declare -A svc_ips Mar 09 13:22:28 crc kubenswrapper[4764]: for svc in "${services[@]}"; do Mar 09 13:22:28 crc kubenswrapper[4764]: # Fetch service IP from cluster dns if present. We make several tries Mar 09 13:22:28 crc kubenswrapper[4764]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 09 13:22:28 crc kubenswrapper[4764]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 09 13:22:28 crc kubenswrapper[4764]: # support UDP loadbalancers and require reaching DNS through TCP. 
Mar 09 13:22:28 crc kubenswrapper[4764]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 09 13:22:28 crc kubenswrapper[4764]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 09 13:22:28 crc kubenswrapper[4764]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 09 13:22:28 crc kubenswrapper[4764]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 09 13:22:28 crc kubenswrapper[4764]: for i in ${!cmds[*]} Mar 09 13:22:28 crc kubenswrapper[4764]: do Mar 09 13:22:28 crc kubenswrapper[4764]: ips=($(eval "${cmds[i]}")) Mar 09 13:22:28 crc kubenswrapper[4764]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 09 13:22:28 crc kubenswrapper[4764]: svc_ips["${svc}"]="${ips[@]}" Mar 09 13:22:28 crc kubenswrapper[4764]: break Mar 09 13:22:28 crc kubenswrapper[4764]: fi Mar 09 13:22:28 crc kubenswrapper[4764]: done Mar 09 13:22:28 crc kubenswrapper[4764]: done Mar 09 13:22:28 crc kubenswrapper[4764]: Mar 09 13:22:28 crc kubenswrapper[4764]: # Update /etc/hosts only if we get valid service IPs Mar 09 13:22:28 crc kubenswrapper[4764]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 09 13:22:28 crc kubenswrapper[4764]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 09 13:22:28 crc kubenswrapper[4764]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 09 13:22:28 crc kubenswrapper[4764]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 09 13:22:28 crc kubenswrapper[4764]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 09 13:22:28 crc kubenswrapper[4764]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 09 13:22:28 crc kubenswrapper[4764]: sleep 60 & wait Mar 09 13:22:28 crc kubenswrapper[4764]: continue Mar 09 13:22:28 crc kubenswrapper[4764]: fi Mar 09 13:22:28 crc kubenswrapper[4764]: Mar 09 13:22:28 crc kubenswrapper[4764]: # Append resolver entries for services Mar 09 13:22:28 crc kubenswrapper[4764]: rc=0 Mar 09 13:22:28 crc kubenswrapper[4764]: for svc in "${!svc_ips[@]}"; do Mar 09 13:22:28 crc kubenswrapper[4764]: for ip in ${svc_ips[${svc}]}; do Mar 09 13:22:28 crc kubenswrapper[4764]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 09 13:22:28 crc kubenswrapper[4764]: done Mar 09 13:22:28 crc kubenswrapper[4764]: done Mar 09 13:22:28 crc kubenswrapper[4764]: if [[ $rc -ne 0 ]]; then Mar 09 13:22:28 crc kubenswrapper[4764]: sleep 60 & wait Mar 09 13:22:28 crc kubenswrapper[4764]: continue Mar 09 13:22:28 crc kubenswrapper[4764]: fi Mar 09 13:22:28 crc kubenswrapper[4764]: Mar 09 13:22:28 crc kubenswrapper[4764]: Mar 09 13:22:28 crc kubenswrapper[4764]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 09 13:22:28 crc kubenswrapper[4764]: # Replace /etc/hosts with our modified version if needed Mar 09 13:22:28 crc kubenswrapper[4764]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 09 13:22:28 crc kubenswrapper[4764]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 09 13:22:28 crc kubenswrapper[4764]: fi Mar 09 13:22:28 crc kubenswrapper[4764]: sleep 60 & wait Mar 09 13:22:28 crc kubenswrapper[4764]: unset svc_ips Mar 09 13:22:28 crc kubenswrapper[4764]: done Mar 09 13:22:28 crc kubenswrapper[4764]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pjp5c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-r5bnx_openshift-dns(7fede188-66d9-4cb1-af19-c94afe7fbcde): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 09 13:22:28 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.907368 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerStarted","Data":"65054061a375abf79424d7095dced33c492d7e93ee2f22f668abd5a0c5ced956"} Mar 09 13:22:28 crc 
kubenswrapper[4764]: E0309 13:22:28.907915 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 13:22:28 crc kubenswrapper[4764]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 09 13:22:28 crc kubenswrapper[4764]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 09 13:22:28 crc kubenswrapper[4764]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rk5pb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-zmzm7_openshift-multus(202a1f58-ce83-4374-ac48-dc806f7b9d6b): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 09 13:22:28 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.910178 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-r5bnx" podUID="7fede188-66d9-4cb1-af19-c94afe7fbcde" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.910335 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-zmzm7" podUID="202a1f58-ce83-4374-ac48-dc806f7b9d6b" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.911135 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 13:22:28 crc kubenswrapper[4764]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 09 13:22:28 crc kubenswrapper[4764]: apiVersion: v1 Mar 09 13:22:28 crc kubenswrapper[4764]: clusters: Mar 09 13:22:28 crc kubenswrapper[4764]: - cluster: Mar 09 13:22:28 crc kubenswrapper[4764]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 09 13:22:28 crc kubenswrapper[4764]: server: https://api-int.crc.testing:6443 Mar 09 13:22:28 crc kubenswrapper[4764]: name: default-cluster Mar 09 13:22:28 crc kubenswrapper[4764]: contexts: Mar 09 13:22:28 crc kubenswrapper[4764]: - context: Mar 09 13:22:28 crc kubenswrapper[4764]: cluster: default-cluster Mar 09 13:22:28 crc kubenswrapper[4764]: namespace: default Mar 09 13:22:28 crc kubenswrapper[4764]: user: default-auth Mar 09 13:22:28 crc kubenswrapper[4764]: name: default-context Mar 09 13:22:28 crc kubenswrapper[4764]: current-context: default-context Mar 09 13:22:28 crc kubenswrapper[4764]: kind: Config Mar 09 13:22:28 crc kubenswrapper[4764]: preferences: {} Mar 09 13:22:28 crc kubenswrapper[4764]: users: Mar 09 13:22:28 crc kubenswrapper[4764]: - name: default-auth Mar 09 13:22:28 crc kubenswrapper[4764]: user: Mar 09 13:22:28 crc kubenswrapper[4764]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 09 13:22:28 crc kubenswrapper[4764]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 09 13:22:28 crc kubenswrapper[4764]: EOF Mar 09 13:22:28 
crc kubenswrapper[4764]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r5xrv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 09 13:22:28 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.912237 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.912438 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct 
envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.921911 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.932031 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.943334 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.953750 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.962608 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.966840 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.966871 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.966881 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.966895 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.966905 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:28Z","lastTransitionTime":"2026-03-09T13:22:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.972693 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a
9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.980043 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.988685 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.999270 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.013381 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.022298 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.029802 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.037607 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.045615 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.053910 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.060993 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.069830 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:29 crc 
kubenswrapper[4764]: I0309 13:22:29.069877 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.069889 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.069909 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.069922 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:29Z","lastTransitionTime":"2026-03-09T13:22:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.070472 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.081035 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.087891 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.095796 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.105874 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.119991 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.128752 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.172540 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.172605 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.172620 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.172675 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.172699 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:29Z","lastTransitionTime":"2026-03-09T13:22:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.274785 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.274842 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.274860 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.274882 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.274898 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:29Z","lastTransitionTime":"2026-03-09T13:22:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.378110 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.378161 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.378176 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.378199 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.378225 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:29Z","lastTransitionTime":"2026-03-09T13:22:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.482042 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.482086 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.482097 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.482115 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.482127 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:29Z","lastTransitionTime":"2026-03-09T13:22:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:29 crc kubenswrapper[4764]: E0309 13:22:29.561572 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},Re
startPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 09 13:22:29 crc kubenswrapper[4764]: E0309 13:22:29.562861 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.584873 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.584935 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.584948 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.584973 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.584986 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:29Z","lastTransitionTime":"2026-03-09T13:22:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.688737 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.688796 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.688808 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.688829 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.688842 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:29Z","lastTransitionTime":"2026-03-09T13:22:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.791377 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.791668 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.791784 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.791852 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.791908 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:29Z","lastTransitionTime":"2026-03-09T13:22:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.895656 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.895908 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.896013 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.896130 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.896214 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:29Z","lastTransitionTime":"2026-03-09T13:22:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.998932 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.998990 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.999003 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.999021 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.999035 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:29Z","lastTransitionTime":"2026-03-09T13:22:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.102657 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.102706 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.102720 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.102739 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.102749 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:30Z","lastTransitionTime":"2026-03-09T13:22:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.205211 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.205255 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.205264 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.205278 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.205288 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:30Z","lastTransitionTime":"2026-03-09T13:22:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.308451 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.308498 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.308509 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.308529 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.308539 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:30Z","lastTransitionTime":"2026-03-09T13:22:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.361489 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:22:46.361460036 +0000 UTC m=+121.611631954 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.361544 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.361788 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.361849 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.361886 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") 
pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.361924 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.362038 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.362098 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:46.362087712 +0000 UTC m=+121.612259630 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.362265 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.362291 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.362307 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.362350 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:46.362339929 +0000 UTC m=+121.612511847 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.362427 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.362467 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:46.362456702 +0000 UTC m=+121.612628620 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.362551 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.362573 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.362584 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.362615 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:46.362606746 +0000 UTC m=+121.612778664 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.413383 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.413420 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.413429 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.413446 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.413455 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:30Z","lastTransitionTime":"2026-03-09T13:22:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.516199 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.516761 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.516837 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.516929 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.517005 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:30Z","lastTransitionTime":"2026-03-09T13:22:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.558807 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.558953 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.559010 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.559138 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.559227 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.559352 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.560307 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 13:22:30 crc kubenswrapper[4764]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 09 13:22:30 crc kubenswrapper[4764]: set -o allexport Mar 09 13:22:30 crc kubenswrapper[4764]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 09 13:22:30 crc kubenswrapper[4764]: source /etc/kubernetes/apiserver-url.env Mar 09 13:22:30 crc kubenswrapper[4764]: else Mar 09 13:22:30 crc kubenswrapper[4764]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 09 13:22:30 crc kubenswrapper[4764]: exit 1 Mar 09 13:22:30 crc kubenswrapper[4764]: fi Mar 09 13:22:30 crc kubenswrapper[4764]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 09 13:22:30 crc kubenswrapper[4764]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 09 13:22:30 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.561475 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.621305 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 
13:22:30.621347 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.621357 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.621372 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.621382 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:30Z","lastTransitionTime":"2026-03-09T13:22:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.723559 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.723592 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.723603 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.723616 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.723625 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:30Z","lastTransitionTime":"2026-03-09T13:22:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.826450 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.826494 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.826504 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.826524 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.826534 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:30Z","lastTransitionTime":"2026-03-09T13:22:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.928879 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.928933 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.928944 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.928958 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.928986 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:30Z","lastTransitionTime":"2026-03-09T13:22:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.031756 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.031797 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.031808 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.031823 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.031834 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:31Z","lastTransitionTime":"2026-03-09T13:22:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.135564 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.135681 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.135709 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.135738 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.135763 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:31Z","lastTransitionTime":"2026-03-09T13:22:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.238862 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.238926 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.238936 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.238952 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.238962 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:31Z","lastTransitionTime":"2026-03-09T13:22:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.342633 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.342695 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.342704 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.342720 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.342729 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:31Z","lastTransitionTime":"2026-03-09T13:22:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.445775 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.446331 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.446399 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.446514 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.446579 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:31Z","lastTransitionTime":"2026-03-09T13:22:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.550129 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.550186 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.550199 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.550219 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.550234 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:31Z","lastTransitionTime":"2026-03-09T13:22:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.653725 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.653781 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.653792 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.653816 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.653831 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:31Z","lastTransitionTime":"2026-03-09T13:22:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.758028 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.758094 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.758116 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.758144 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.758167 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:31Z","lastTransitionTime":"2026-03-09T13:22:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.861328 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.861368 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.861376 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.861391 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.861400 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:31Z","lastTransitionTime":"2026-03-09T13:22:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.965879 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.965978 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.966014 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.966042 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.966062 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:31Z","lastTransitionTime":"2026-03-09T13:22:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.070848 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.070925 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.070949 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.070979 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.071002 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:32Z","lastTransitionTime":"2026-03-09T13:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.174214 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.174269 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.174283 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.174303 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.174318 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:32Z","lastTransitionTime":"2026-03-09T13:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.277451 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.277527 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.277550 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.277581 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.277603 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:32Z","lastTransitionTime":"2026-03-09T13:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.379556 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.379590 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.379598 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.379614 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.379625 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:32Z","lastTransitionTime":"2026-03-09T13:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.482067 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.482109 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.482118 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.482133 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.482141 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:32Z","lastTransitionTime":"2026-03-09T13:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.559075 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.559124 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:32 crc kubenswrapper[4764]: E0309 13:22:32.559219 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.559080 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:32 crc kubenswrapper[4764]: E0309 13:22:32.559491 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:32 crc kubenswrapper[4764]: E0309 13:22:32.559881 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.559927 4764 scope.go:117] "RemoveContainer" containerID="033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.584942 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.584982 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.584995 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.585011 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.585022 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:32Z","lastTransitionTime":"2026-03-09T13:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.687177 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.687781 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.687804 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.687860 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.687876 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:32Z","lastTransitionTime":"2026-03-09T13:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.790930 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.790962 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.790972 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.790985 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.790997 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:32Z","lastTransitionTime":"2026-03-09T13:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.892985 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.893027 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.893038 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.893056 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.893068 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:32Z","lastTransitionTime":"2026-03-09T13:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.901257 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.901282 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.901290 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.901300 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.901307 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:32Z","lastTransitionTime":"2026-03-09T13:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:32 crc kubenswrapper[4764]: E0309 13:22:32.912332 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.915481 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.915511 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.915522 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.915570 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.915587 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:32Z","lastTransitionTime":"2026-03-09T13:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:32 crc kubenswrapper[4764]: E0309 13:22:32.923606 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.925694 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.926437 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.926467 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.926478 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.926493 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.926503 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:32Z","lastTransitionTime":"2026-03-09T13:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.927694 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811"} Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.928034 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:22:32 crc kubenswrapper[4764]: E0309 13:22:32.935698 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.939172 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.939201 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.939211 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.939226 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.939250 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:32Z","lastTransitionTime":"2026-03-09T13:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.943538 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:32 crc kubenswrapper[4764]: E0309 13:22:32.949810 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.951885 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.952936 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.952971 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.952981 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.952995 4764 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.953007 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:32Z","lastTransitionTime":"2026-03-09T13:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:32 crc kubenswrapper[4764]: E0309 13:22:32.960370 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:32 crc kubenswrapper[4764]: E0309 13:22:32.960475 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.965697 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.973689 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.980516 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.989839 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.995779 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.995821 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.995831 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.995846 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.995856 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:32Z","lastTransitionTime":"2026-03-09T13:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.999800 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.006874 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.016586 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.027394 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.040587 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.048271 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.098164 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.098204 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.098216 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.098231 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.098242 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:33Z","lastTransitionTime":"2026-03-09T13:22:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.200113 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.200160 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.200173 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.200189 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.200202 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:33Z","lastTransitionTime":"2026-03-09T13:22:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.302077 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.302116 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.302125 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.302138 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.302151 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:33Z","lastTransitionTime":"2026-03-09T13:22:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.404898 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.404942 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.404956 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.404977 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.404993 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:33Z","lastTransitionTime":"2026-03-09T13:22:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.507033 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.507075 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.507088 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.507103 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.507112 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:33Z","lastTransitionTime":"2026-03-09T13:22:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.609716 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.609760 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.609773 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.609788 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.609800 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:33Z","lastTransitionTime":"2026-03-09T13:22:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.712235 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.712270 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.712279 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.712292 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.712305 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:33Z","lastTransitionTime":"2026-03-09T13:22:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.821038 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.821092 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.821103 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.821124 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.821137 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:33Z","lastTransitionTime":"2026-03-09T13:22:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.926599 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.926687 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.926724 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.926759 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.926774 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:33Z","lastTransitionTime":"2026-03-09T13:22:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.028695 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.028751 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.028760 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.028773 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.028782 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:34Z","lastTransitionTime":"2026-03-09T13:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.081204 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-hbmjc"] Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.081609 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-hbmjc" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.083873 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.084055 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.084159 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.084472 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.096876 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.108627 4764 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.110175 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers 
with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.126551 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.131046 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.131072 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.131080 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.131093 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.131102 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:34Z","lastTransitionTime":"2026-03-09T13:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.135014 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.146919 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.157192 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.170547 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.190267 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.200332 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7c2b6620-c8b8-47cf-9f15-883dbf8e34cc-host\") pod \"node-ca-hbmjc\" (UID: \"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\") " pod="openshift-image-registry/node-ca-hbmjc" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.200609 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4sdq\" (UniqueName: \"kubernetes.io/projected/7c2b6620-c8b8-47cf-9f15-883dbf8e34cc-kube-api-access-z4sdq\") pod 
\"node-ca-hbmjc\" (UID: \"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\") " pod="openshift-image-registry/node-ca-hbmjc" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.200733 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7c2b6620-c8b8-47cf-9f15-883dbf8e34cc-serviceca\") pod \"node-ca-hbmjc\" (UID: \"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\") " pod="openshift-image-registry/node-ca-hbmjc" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.204729 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.226318 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.233570 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.233706 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.233792 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.233869 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.233932 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:34Z","lastTransitionTime":"2026-03-09T13:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.235848 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.246182 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.256131 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.301754 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4sdq\" (UniqueName: \"kubernetes.io/projected/7c2b6620-c8b8-47cf-9f15-883dbf8e34cc-kube-api-access-z4sdq\") pod \"node-ca-hbmjc\" (UID: \"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\") " pod="openshift-image-registry/node-ca-hbmjc" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.301807 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7c2b6620-c8b8-47cf-9f15-883dbf8e34cc-serviceca\") pod \"node-ca-hbmjc\" (UID: \"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\") " pod="openshift-image-registry/node-ca-hbmjc" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.301852 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7c2b6620-c8b8-47cf-9f15-883dbf8e34cc-host\") pod \"node-ca-hbmjc\" (UID: \"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\") " pod="openshift-image-registry/node-ca-hbmjc" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.301921 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7c2b6620-c8b8-47cf-9f15-883dbf8e34cc-host\") pod \"node-ca-hbmjc\" (UID: 
\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\") " pod="openshift-image-registry/node-ca-hbmjc" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.303078 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7c2b6620-c8b8-47cf-9f15-883dbf8e34cc-serviceca\") pod \"node-ca-hbmjc\" (UID: \"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\") " pod="openshift-image-registry/node-ca-hbmjc" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.319951 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4sdq\" (UniqueName: \"kubernetes.io/projected/7c2b6620-c8b8-47cf-9f15-883dbf8e34cc-kube-api-access-z4sdq\") pod \"node-ca-hbmjc\" (UID: \"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\") " pod="openshift-image-registry/node-ca-hbmjc" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.336730 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.336773 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.336782 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.336799 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.336808 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:34Z","lastTransitionTime":"2026-03-09T13:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.393806 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hbmjc" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.439985 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.440031 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.440043 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.440059 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.440069 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:34Z","lastTransitionTime":"2026-03-09T13:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.542758 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.542787 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.542796 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.542809 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.542817 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:34Z","lastTransitionTime":"2026-03-09T13:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.559691 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.559739 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.559737 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:34 crc kubenswrapper[4764]: E0309 13:22:34.559806 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:34 crc kubenswrapper[4764]: E0309 13:22:34.560074 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:34 crc kubenswrapper[4764]: E0309 13:22:34.560177 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.645486 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.645520 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.645527 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.645539 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.645548 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:34Z","lastTransitionTime":"2026-03-09T13:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.748149 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.748205 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.748217 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.748235 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.748245 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:34Z","lastTransitionTime":"2026-03-09T13:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.850909 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.850971 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.850984 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.851002 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.851015 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:34Z","lastTransitionTime":"2026-03-09T13:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.934440 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hbmjc" event={"ID":"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc","Type":"ContainerStarted","Data":"6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9"} Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.934496 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hbmjc" event={"ID":"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc","Type":"ContainerStarted","Data":"583d7339a6ff7aca2cee97b44d5ae02c2ed0d21a333b208ac485f770c90332e8"} Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.948009 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.956298 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.956368 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.956383 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.956403 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.956422 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:34Z","lastTransitionTime":"2026-03-09T13:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.967306 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.981288 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.994449 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.008952 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.016968 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.028237 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.041581 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.056600 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.058999 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.059049 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.059063 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.059082 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.059098 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:35Z","lastTransitionTime":"2026-03-09T13:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.066383 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.076357 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.087341 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.097173 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.161558 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.161601 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.161610 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.161624 4764 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.161634 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:35Z","lastTransitionTime":"2026-03-09T13:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.265018 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.265114 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.265129 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.265151 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.265167 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:35Z","lastTransitionTime":"2026-03-09T13:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.367231 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.367308 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.367321 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.367338 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.367375 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:35Z","lastTransitionTime":"2026-03-09T13:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.470112 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.470143 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.470152 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.470164 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.470173 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:35Z","lastTransitionTime":"2026-03-09T13:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.572103 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.572158 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.572173 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.572190 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.572205 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:35Z","lastTransitionTime":"2026-03-09T13:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.576309 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.589630 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.602808 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.615326 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.629338 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.642860 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.655638 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.664187 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.674148 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.674181 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.674191 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.674207 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.674217 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:35Z","lastTransitionTime":"2026-03-09T13:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.675663 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.689861 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.707517 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.717169 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-
09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.726997 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.777821 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.777888 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.777904 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.777928 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.777943 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:35Z","lastTransitionTime":"2026-03-09T13:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.880862 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.881146 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.881234 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.881344 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.881436 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:35Z","lastTransitionTime":"2026-03-09T13:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.984058 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.984127 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.984141 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.984160 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.984176 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:35Z","lastTransitionTime":"2026-03-09T13:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.087199 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.087232 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.087243 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.087257 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.087267 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:36Z","lastTransitionTime":"2026-03-09T13:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.189574 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.189618 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.189627 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.189655 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.189664 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:36Z","lastTransitionTime":"2026-03-09T13:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.292236 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.292270 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.292280 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.292292 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.292300 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:36Z","lastTransitionTime":"2026-03-09T13:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.394608 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.394675 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.394687 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.394702 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.394713 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:36Z","lastTransitionTime":"2026-03-09T13:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.497513 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.497554 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.497563 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.497577 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.497587 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:36Z","lastTransitionTime":"2026-03-09T13:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.559066 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.559110 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.559181 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:36 crc kubenswrapper[4764]: E0309 13:22:36.559300 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:36 crc kubenswrapper[4764]: E0309 13:22:36.559408 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:36 crc kubenswrapper[4764]: E0309 13:22:36.559484 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.599409 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.599453 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.599465 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.599485 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.599496 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:36Z","lastTransitionTime":"2026-03-09T13:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.701933 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.701969 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.701979 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.701993 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.702003 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:36Z","lastTransitionTime":"2026-03-09T13:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.804969 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.805011 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.805021 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.805036 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.805048 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:36Z","lastTransitionTime":"2026-03-09T13:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.907595 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.907631 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.907670 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.907688 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.907700 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:36Z","lastTransitionTime":"2026-03-09T13:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.010501 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.010542 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.010562 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.010581 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.010595 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:37Z","lastTransitionTime":"2026-03-09T13:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.112349 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.112400 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.112415 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.112436 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.112452 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:37Z","lastTransitionTime":"2026-03-09T13:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.215507 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.215544 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.215554 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.215570 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.215590 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:37Z","lastTransitionTime":"2026-03-09T13:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.318194 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.318233 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.318243 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.318260 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.318273 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:37Z","lastTransitionTime":"2026-03-09T13:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.420758 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.420790 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.420798 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.420812 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.420821 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:37Z","lastTransitionTime":"2026-03-09T13:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.523784 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.523818 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.523826 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.523839 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.523848 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:37Z","lastTransitionTime":"2026-03-09T13:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.626767 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.626829 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.626845 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.626871 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.626887 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:37Z","lastTransitionTime":"2026-03-09T13:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.729279 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.729331 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.729343 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.729359 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.729370 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:37Z","lastTransitionTime":"2026-03-09T13:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.832325 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.832373 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.832387 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.832406 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.832418 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:37Z","lastTransitionTime":"2026-03-09T13:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.935251 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.935308 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.935318 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.935335 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.935346 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:37Z","lastTransitionTime":"2026-03-09T13:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.037722 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.037822 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.037834 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.037851 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.037863 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:38Z","lastTransitionTime":"2026-03-09T13:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.140754 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.140793 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.140803 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.140818 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.140830 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:38Z","lastTransitionTime":"2026-03-09T13:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.243881 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.243933 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.243944 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.243964 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.243975 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:38Z","lastTransitionTime":"2026-03-09T13:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.346615 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.346682 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.346694 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.346709 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.346720 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:38Z","lastTransitionTime":"2026-03-09T13:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.449385 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.449752 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.449864 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.449966 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.450064 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:38Z","lastTransitionTime":"2026-03-09T13:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.552956 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.553013 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.553024 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.553043 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.553086 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:38Z","lastTransitionTime":"2026-03-09T13:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.559197 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:38 crc kubenswrapper[4764]: E0309 13:22:38.559346 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.559453 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:38 crc kubenswrapper[4764]: E0309 13:22:38.559660 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.560059 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:38 crc kubenswrapper[4764]: E0309 13:22:38.560185 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.655873 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.655920 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.655929 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.655947 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.655956 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:38Z","lastTransitionTime":"2026-03-09T13:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.757965 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.758013 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.758024 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.758041 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.758052 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:38Z","lastTransitionTime":"2026-03-09T13:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.880936 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.880973 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.880980 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.880993 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.881001 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:38Z","lastTransitionTime":"2026-03-09T13:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.983893 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.983963 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.983977 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.984002 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.984017 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:38Z","lastTransitionTime":"2026-03-09T13:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.087038 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.087085 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.087102 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.087121 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.087136 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:39Z","lastTransitionTime":"2026-03-09T13:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.190246 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.190292 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.190300 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.190317 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.190327 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:39Z","lastTransitionTime":"2026-03-09T13:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.293432 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.293512 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.293525 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.293548 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.293562 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:39Z","lastTransitionTime":"2026-03-09T13:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.397023 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.397074 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.397089 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.397110 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.397129 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:39Z","lastTransitionTime":"2026-03-09T13:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.500158 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.500217 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.500230 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.500250 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.500265 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:39Z","lastTransitionTime":"2026-03-09T13:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.603399 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.603436 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.603444 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.603461 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.603471 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:39Z","lastTransitionTime":"2026-03-09T13:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.705953 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.705985 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.705993 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.706006 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.706014 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:39Z","lastTransitionTime":"2026-03-09T13:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.795960 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs"] Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.796795 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.799010 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.799186 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.809107 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.809173 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.809187 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.809206 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.809219 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:39Z","lastTransitionTime":"2026-03-09T13:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.809808 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.821021 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.829174 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.837342 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.845826 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.855267 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.867116 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.883065 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.891327 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-
09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.900220 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.909912 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.912618 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.912676 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.912688 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.912704 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.912716 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:39Z","lastTransitionTime":"2026-03-09T13:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.920300 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.930143 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.938017 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.947168 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39"} Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.947216 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454"} Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.957444 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.966812 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/30a92c50-fe51-40d9-a69c-4b5fd722bfc6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-hzjxs\" (UID: \"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.966971 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-mdm9q\" (UniqueName: \"kubernetes.io/projected/30a92c50-fe51-40d9-a69c-4b5fd722bfc6-kube-api-access-mdm9q\") pod \"ovnkube-control-plane-749d76644c-hzjxs\" (UID: \"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.967014 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/30a92c50-fe51-40d9-a69c-4b5fd722bfc6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-hzjxs\" (UID: \"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.967040 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/30a92c50-fe51-40d9-a69c-4b5fd722bfc6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-hzjxs\" (UID: \"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.968460 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.979812 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.988103 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.999067 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.008423 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.014554 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:40 crc 
kubenswrapper[4764]: I0309 13:22:40.014594 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.014608 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.014628 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.014656 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:40Z","lastTransitionTime":"2026-03-09T13:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.019309 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.035743 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.044226 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-
09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.052558 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.064900 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.067746 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdm9q\" (UniqueName: \"kubernetes.io/projected/30a92c50-fe51-40d9-a69c-4b5fd722bfc6-kube-api-access-mdm9q\") pod \"ovnkube-control-plane-749d76644c-hzjxs\" (UID: \"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.067797 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/30a92c50-fe51-40d9-a69c-4b5fd722bfc6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-hzjxs\" (UID: \"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" Mar 
09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.067816 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/30a92c50-fe51-40d9-a69c-4b5fd722bfc6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-hzjxs\" (UID: \"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.067869 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/30a92c50-fe51-40d9-a69c-4b5fd722bfc6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-hzjxs\" (UID: \"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.068545 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/30a92c50-fe51-40d9-a69c-4b5fd722bfc6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-hzjxs\" (UID: \"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.068592 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/30a92c50-fe51-40d9-a69c-4b5fd722bfc6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-hzjxs\" (UID: \"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.072231 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/30a92c50-fe51-40d9-a69c-4b5fd722bfc6-ovn-control-plane-metrics-cert\") pod 
\"ovnkube-control-plane-749d76644c-hzjxs\" (UID: \"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.073978 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.085334 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdm9q\" (UniqueName: \"kubernetes.io/projected/30a92c50-fe51-40d9-a69c-4b5fd722bfc6-kube-api-access-mdm9q\") pod \"ovnkube-control-plane-749d76644c-hzjxs\" (UID: \"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.085558 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.094481 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.109880 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.117312 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.117346 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.117355 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.117369 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.117378 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:40Z","lastTransitionTime":"2026-03-09T13:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:40 crc kubenswrapper[4764]: W0309 13:22:40.121348 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30a92c50_fe51_40d9_a69c_4b5fd722bfc6.slice/crio-b60d1a57310c4eecf9739792746a14c95b7b8a0de18df8ad29785d547b60b123 WatchSource:0}: Error finding container b60d1a57310c4eecf9739792746a14c95b7b8a0de18df8ad29785d547b60b123: Status 404 returned error can't find the container with id b60d1a57310c4eecf9739792746a14c95b7b8a0de18df8ad29785d547b60b123 Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.219801 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.219859 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.219871 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.219888 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.219900 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:40Z","lastTransitionTime":"2026-03-09T13:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.324249 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.324283 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.324296 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.324313 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.324459 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:40Z","lastTransitionTime":"2026-03-09T13:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.426715 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.426802 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.426851 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.426876 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.426892 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:40Z","lastTransitionTime":"2026-03-09T13:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.508866 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-wkwdz"] Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.509422 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:40 crc kubenswrapper[4764]: E0309 13:22:40.509518 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.527843 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.529023 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.529065 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.529076 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.529104 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.529115 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:40Z","lastTransitionTime":"2026-03-09T13:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.543634 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.557804 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.559096 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.559144 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.559140 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:40 crc kubenswrapper[4764]: E0309 13:22:40.559194 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:40 crc kubenswrapper[4764]: E0309 13:22:40.559288 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:40 crc kubenswrapper[4764]: E0309 13:22:40.559465 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.569753 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.582950 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.594195 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.604780 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.614697 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc 
kubenswrapper[4764]: I0309 13:22:40.625351 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.631174 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.631206 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.631215 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.631230 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.631243 4764 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:40Z","lastTransitionTime":"2026-03-09T13:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.633696 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.644182 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.657416 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.675591 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs\") pod \"network-metrics-daemon-wkwdz\" (UID: \"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\") " pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.675713 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsbrz\" (UniqueName: \"kubernetes.io/projected/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-kube-api-access-gsbrz\") pod \"network-metrics-daemon-wkwdz\" (UID: \"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\") " pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.678684 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.689834 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.703345 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.733887 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 
13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.733933 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.733943 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.733975 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.733987 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:40Z","lastTransitionTime":"2026-03-09T13:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.776568 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsbrz\" (UniqueName: \"kubernetes.io/projected/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-kube-api-access-gsbrz\") pod \"network-metrics-daemon-wkwdz\" (UID: \"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\") " pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.776636 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs\") pod \"network-metrics-daemon-wkwdz\" (UID: \"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\") " pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:40 crc kubenswrapper[4764]: E0309 13:22:40.776776 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" 
not registered Mar 09 13:22:40 crc kubenswrapper[4764]: E0309 13:22:40.776859 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs podName:6597fc34-10ee-4984-9c69-f4b7c0d46e2a nodeName:}" failed. No retries permitted until 2026-03-09 13:22:41.276831472 +0000 UTC m=+116.527003380 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs") pod "network-metrics-daemon-wkwdz" (UID: "6597fc34-10ee-4984-9c69-f4b7c0d46e2a") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.800181 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsbrz\" (UniqueName: \"kubernetes.io/projected/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-kube-api-access-gsbrz\") pod \"network-metrics-daemon-wkwdz\" (UID: \"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\") " pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.835838 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.836094 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.836167 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.836230 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.836332 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:40Z","lastTransitionTime":"2026-03-09T13:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.939115 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.939152 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.939160 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.939174 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.939184 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:40Z","lastTransitionTime":"2026-03-09T13:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.951088 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" event={"ID":"30a92c50-fe51-40d9-a69c-4b5fd722bfc6","Type":"ContainerStarted","Data":"10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098"} Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.951348 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" event={"ID":"30a92c50-fe51-40d9-a69c-4b5fd722bfc6","Type":"ContainerStarted","Data":"73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727"} Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.951450 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" event={"ID":"30a92c50-fe51-40d9-a69c-4b5fd722bfc6","Type":"ContainerStarted","Data":"b60d1a57310c4eecf9739792746a14c95b7b8a0de18df8ad29785d547b60b123"} Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.952323 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zmzm7" event={"ID":"202a1f58-ce83-4374-ac48-dc806f7b9d6b","Type":"ContainerStarted","Data":"05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331"} Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.972260 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.984193 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.996633 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.013400 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.030595 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.041504 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.041547 4764 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.041555 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.041573 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.041582 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:41Z","lastTransitionTime":"2026-03-09T13:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.045966 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.061999 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.077156 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.090499 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.104966 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.121838 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.131077 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.140708 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db06
36732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.143924 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.144011 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.144033 4764 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.144066 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.144089 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:41Z","lastTransitionTime":"2026-03-09T13:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.152623 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 
09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.170317 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.183565 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.197018 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.210799 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.229569 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.242948 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.246634 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.246705 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.246718 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.246739 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.246751 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:41Z","lastTransitionTime":"2026-03-09T13:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.259427 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.274220 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.283057 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs\") pod \"network-metrics-daemon-wkwdz\" (UID: \"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\") " pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:41 crc kubenswrapper[4764]: E0309 13:22:41.283331 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:22:41 crc kubenswrapper[4764]: E0309 13:22:41.283443 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs podName:6597fc34-10ee-4984-9c69-f4b7c0d46e2a nodeName:}" failed. No retries permitted until 2026-03-09 13:22:42.28341776 +0000 UTC m=+117.533589668 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs") pod "network-metrics-daemon-wkwdz" (UID: "6597fc34-10ee-4984-9c69-f4b7c0d46e2a") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.292540 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.310290 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.343103 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.350618 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.350726 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.350745 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.350774 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.350799 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:41Z","lastTransitionTime":"2026-03-09T13:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.363425 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.380059 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c
77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.394504 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc 
kubenswrapper[4764]: I0309 13:22:41.410499 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.426939 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.454980 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 
13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.455034 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.455045 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.455063 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.455078 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:41Z","lastTransitionTime":"2026-03-09T13:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.558872 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.559055 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.559134 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.559146 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.559175 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.559185 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:41Z","lastTransitionTime":"2026-03-09T13:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:41 crc kubenswrapper[4764]: E0309 13:22:41.559415 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.662661 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.662711 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.662722 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.662738 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.662748 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:41Z","lastTransitionTime":"2026-03-09T13:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.764867 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.764913 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.764922 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.764940 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.764952 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:41Z","lastTransitionTime":"2026-03-09T13:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.868036 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.868092 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.868110 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.868138 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.868159 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:41Z","lastTransitionTime":"2026-03-09T13:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.958510 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" event={"ID":"072442e6-8ece-4f72-a8cb-ad7ef1e3facb","Type":"ContainerStarted","Data":"efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f"} Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.970605 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.970637 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.970661 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.970675 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.970684 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:41Z","lastTransitionTime":"2026-03-09T13:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.982670 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z 
is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.004245 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.031374 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-ce
rt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-open
vswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.046191 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.061662 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.073735 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.073777 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.073790 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.073810 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.073826 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:42Z","lastTransitionTime":"2026-03-09T13:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.074860 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc 
kubenswrapper[4764]: I0309 13:22:42.094120 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.107427 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.121561 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.136845 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.152077 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.165278 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.176737 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.176996 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.177031 4764 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.177045 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.177070 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.177087 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:42Z","lastTransitionTime":"2026-03-09T13:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.188580 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.202079 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.281229 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.281331 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.281359 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.281399 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.281438 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:42Z","lastTransitionTime":"2026-03-09T13:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.293173 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs\") pod \"network-metrics-daemon-wkwdz\" (UID: \"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\") " pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:42 crc kubenswrapper[4764]: E0309 13:22:42.293376 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:22:42 crc kubenswrapper[4764]: E0309 13:22:42.293445 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs podName:6597fc34-10ee-4984-9c69-f4b7c0d46e2a nodeName:}" failed. No retries permitted until 2026-03-09 13:22:44.293423211 +0000 UTC m=+119.543595119 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs") pod "network-metrics-daemon-wkwdz" (UID: "6597fc34-10ee-4984-9c69-f4b7c0d46e2a") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.384760 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.384807 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.384822 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.384844 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.384859 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:42Z","lastTransitionTime":"2026-03-09T13:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.488863 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.488952 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.488975 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.489016 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.489042 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:42Z","lastTransitionTime":"2026-03-09T13:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.559771 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.559837 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.559923 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:42 crc kubenswrapper[4764]: E0309 13:22:42.560610 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:42 crc kubenswrapper[4764]: E0309 13:22:42.560727 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:42 crc kubenswrapper[4764]: E0309 13:22:42.560751 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.593933 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.593968 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.593979 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.593995 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.594008 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:42Z","lastTransitionTime":"2026-03-09T13:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.696501 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.696535 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.696544 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.696558 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.696566 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:42Z","lastTransitionTime":"2026-03-09T13:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.798425 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.798459 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.798469 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.798481 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.798490 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:42Z","lastTransitionTime":"2026-03-09T13:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.900947 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.900983 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.900993 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.901007 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.901017 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:42Z","lastTransitionTime":"2026-03-09T13:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.925027 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.944079 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.961517 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.963177 4764 generic.go:334] "Generic (PLEG): container finished" podID="072442e6-8ece-4f72-a8cb-ad7ef1e3facb" containerID="efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f" exitCode=0 Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.963250 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" event={"ID":"072442e6-8ece-4f72-a8cb-ad7ef1e3facb","Type":"ContainerDied","Data":"efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f"} Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.965707 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-r5bnx" event={"ID":"7fede188-66d9-4cb1-af19-c94afe7fbcde","Type":"ContainerStarted","Data":"7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f"} Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.967758 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97"} Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.983122 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.998431 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.003289 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.003332 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.003341 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.003354 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.003364 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:43Z","lastTransitionTime":"2026-03-09T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.010730 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.022758 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc 
kubenswrapper[4764]: I0309 13:22:43.036520 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.050934 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.063021 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.073126 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.083464 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.099239 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.105223 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.105261 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.105270 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.105285 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.105293 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:43Z","lastTransitionTime":"2026-03-09T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.113867 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.124777 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.149102 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.170997 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" 
Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.187344 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.199197 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.207531 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 
13:22:43.207564 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.207574 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.207587 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.207596 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:43Z","lastTransitionTime":"2026-03-09T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.212599 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.224319 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.241334 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.258585 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.269422 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.279367 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db06
36732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.288103 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc 
kubenswrapper[4764]: I0309 13:22:43.298760 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.307318 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.309445 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.309484 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.309499 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.309520 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.309536 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:43Z","lastTransitionTime":"2026-03-09T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.319147 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read 
at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.331280 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.344744 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.352789 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.352814 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.352822 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:43 crc 
kubenswrapper[4764]: I0309 13:22:43.352835 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.352845 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:43Z","lastTransitionTime":"2026-03-09T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:43 crc kubenswrapper[4764]: E0309 13:22:43.366257 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"kubelet has 
sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff54
7002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\
"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cd
fc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"nam
es\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.369840 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.370006 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.370102 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.370194 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.370285 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:43Z","lastTransitionTime":"2026-03-09T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:43 crc kubenswrapper[4764]: E0309 13:22:43.384006 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.387677 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.387937 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.387955 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.387973 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.387983 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:43Z","lastTransitionTime":"2026-03-09T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:43 crc kubenswrapper[4764]: E0309 13:22:43.402876 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.407033 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.407083 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.407099 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.407118 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.407131 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:43Z","lastTransitionTime":"2026-03-09T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:43 crc kubenswrapper[4764]: E0309 13:22:43.420022 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.423587 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.423629 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.423638 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.423667 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.423678 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:43Z","lastTransitionTime":"2026-03-09T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:43 crc kubenswrapper[4764]: E0309 13:22:43.435438 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: E0309 13:22:43.435550 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.437569 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.437631 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.437660 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.437681 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.437692 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:43Z","lastTransitionTime":"2026-03-09T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.540039 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.540289 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.540366 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.540443 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.540504 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:43Z","lastTransitionTime":"2026-03-09T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.565881 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:43 crc kubenswrapper[4764]: E0309 13:22:43.566049 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.642548 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.642589 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.642599 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.642611 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.642621 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:43Z","lastTransitionTime":"2026-03-09T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.744470 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.744535 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.744546 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.744560 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.744568 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:43Z","lastTransitionTime":"2026-03-09T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.846885 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.846922 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.846962 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.846977 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.846989 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:43Z","lastTransitionTime":"2026-03-09T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.949948 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.949985 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.949995 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.950008 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.950017 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:43Z","lastTransitionTime":"2026-03-09T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.972938 4764 generic.go:334] "Generic (PLEG): container finished" podID="072442e6-8ece-4f72-a8cb-ad7ef1e3facb" containerID="3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a" exitCode=0 Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.973000 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" event={"ID":"072442e6-8ece-4f72-a8cb-ad7ef1e3facb","Type":"ContainerDied","Data":"3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a"} Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.983532 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf
2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.996777 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.026329 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.052896 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.064062 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.064140 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.064165 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.064200 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.064220 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:44Z","lastTransitionTime":"2026-03-09T13:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.064977 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.086422 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c
77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.101936 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:44 crc 
kubenswrapper[4764]: I0309 13:22:44.115249 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.125234 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.137221 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.147819 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.163145 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.167146 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.167178 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.167187 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.167199 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.167227 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:44Z","lastTransitionTime":"2026-03-09T13:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.176607 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.191952 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.210444 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.269817 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.269869 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.269886 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.269909 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.269926 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:44Z","lastTransitionTime":"2026-03-09T13:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.316128 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs\") pod \"network-metrics-daemon-wkwdz\" (UID: \"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\") " pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:44 crc kubenswrapper[4764]: E0309 13:22:44.316254 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:22:44 crc kubenswrapper[4764]: E0309 13:22:44.316297 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs podName:6597fc34-10ee-4984-9c69-f4b7c0d46e2a nodeName:}" failed. No retries permitted until 2026-03-09 13:22:48.316283947 +0000 UTC m=+123.566455855 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs") pod "network-metrics-daemon-wkwdz" (UID: "6597fc34-10ee-4984-9c69-f4b7c0d46e2a") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.372371 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.372406 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.372416 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.372431 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.372442 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:44Z","lastTransitionTime":"2026-03-09T13:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.474820 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.474852 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.474861 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.474874 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.474881 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:44Z","lastTransitionTime":"2026-03-09T13:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.558739 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.558775 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.558922 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:44 crc kubenswrapper[4764]: E0309 13:22:44.559036 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:44 crc kubenswrapper[4764]: E0309 13:22:44.559098 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:44 crc kubenswrapper[4764]: E0309 13:22:44.559751 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.565362 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.577887 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.578156 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.578167 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.578183 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.578193 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:44Z","lastTransitionTime":"2026-03-09T13:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.680524 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.680605 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.680631 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.680698 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.680719 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:44Z","lastTransitionTime":"2026-03-09T13:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.784250 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.784281 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.784291 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.784304 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.784313 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:44Z","lastTransitionTime":"2026-03-09T13:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.886255 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.886296 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.886305 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.886319 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.886329 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:44Z","lastTransitionTime":"2026-03-09T13:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.978891 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b"} Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.978946 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad"} Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.980867 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f"} Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.983290 4764 generic.go:334] "Generic (PLEG): container finished" podID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerID="cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66" exitCode=0 Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.983359 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerDied","Data":"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66"} Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.989239 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.989284 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:44 crc kubenswrapper[4764]: 
I0309 13:22:44.989295 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.989314 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.989325 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:44Z","lastTransitionTime":"2026-03-09T13:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.990964 4764 generic.go:334] "Generic (PLEG): container finished" podID="072442e6-8ece-4f72-a8cb-ad7ef1e3facb" containerID="1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91" exitCode=0 Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.991041 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" event={"ID":"072442e6-8ece-4f72-a8cb-ad7ef1e3facb","Type":"ContainerDied","Data":"1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91"} Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.994033 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528f
ed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.011560 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.026813 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.039751 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.058126 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.074581 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" 
Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.089977 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.091555 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.091599 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.091611 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.091682 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.091699 4764 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:45Z","lastTransitionTime":"2026-03-09T13:22:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.111099 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.126118 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db06
36732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.138354 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc 
kubenswrapper[4764]: I0309 13:22:45.158038 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.178122 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.194008 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.194047 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.194057 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.194075 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.194084 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:45Z","lastTransitionTime":"2026-03-09T13:22:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.196273 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\
\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.214324 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.234372 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.249001 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.264562 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528f
ed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.278213 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.296065 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.296180 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.296196 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.296205 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:45 crc 
kubenswrapper[4764]: I0309 13:22:45.296284 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.296297 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:45Z","lastTransitionTime":"2026-03-09T13:22:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.309463 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.323720 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.335556 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.346937 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.363814 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.375503 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.389684 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff4703424
8b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.398249 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.398276 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.398284 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.398297 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.398308 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:45Z","lastTransitionTime":"2026-03-09T13:22:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.407699 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.420396 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.432095 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db06
36732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.441280 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc 
kubenswrapper[4764]: I0309 13:22:45.455160 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.467301 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: E0309 13:22:45.499506 4764 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.559299 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:45 crc kubenswrapper[4764]: E0309 13:22:45.559440 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.574789 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6c
c95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.587409 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.598990 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.613867 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.623415 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.634587 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: E0309 13:22:45.639318 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.649783 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 
13:22:45.664613 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.671990 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.681544 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db06
36732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.690103 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc 
kubenswrapper[4764]: I0309 13:22:45.701989 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.712093 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 
13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.721721 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.731811 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.742719 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.997411 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerStarted","Data":"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519"} Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.997970 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerStarted","Data":"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29"} Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.997992 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerStarted","Data":"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01"} Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.998005 4764 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerStarted","Data":"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb"} Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.998016 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerStarted","Data":"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4"} Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.999822 4764 generic.go:334] "Generic (PLEG): container finished" podID="072442e6-8ece-4f72-a8cb-ad7ef1e3facb" containerID="885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96" exitCode=0 Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.999855 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" event={"ID":"072442e6-8ece-4f72-a8cb-ad7ef1e3facb","Type":"ContainerDied","Data":"885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96"} Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.014845 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.032684 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:46Z is after 2025-08-24T17:21:41Z" 
Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.045317 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.061340 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.075573 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.086179 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.100780 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d0536
2e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubern
etes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.118609 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff4703424
8b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.143613 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.154558 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc
242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.172952 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db06
36732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.189636 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:46 crc 
kubenswrapper[4764]: I0309 13:22:46.211189 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.222310 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.238467 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.250260 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.445618 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:22:46 crc kubenswrapper[4764]: E0309 13:22:46.445796 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:18.445759973 +0000 UTC m=+153.695931881 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.446053 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.446081 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.446109 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.446130 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:46 crc kubenswrapper[4764]: E0309 13:22:46.446167 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:22:46 crc kubenswrapper[4764]: E0309 13:22:46.446217 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:23:18.446208685 +0000 UTC m=+153.696380593 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:22:46 crc kubenswrapper[4764]: E0309 13:22:46.446252 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:22:46 crc kubenswrapper[4764]: E0309 13:22:46.446262 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:22:46 crc kubenswrapper[4764]: E0309 13:22:46.446274 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:22:46 crc kubenswrapper[4764]: E0309 
13:22:46.446301 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:46 crc kubenswrapper[4764]: E0309 13:22:46.446264 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:22:46 crc kubenswrapper[4764]: E0309 13:22:46.446330 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:23:18.446312068 +0000 UTC m=+153.696484056 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:22:46 crc kubenswrapper[4764]: E0309 13:22:46.446341 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:22:46 crc kubenswrapper[4764]: E0309 13:22:46.446349 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:23:18.446339179 +0000 UTC m=+153.696511207 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:46 crc kubenswrapper[4764]: E0309 13:22:46.446351 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:46 crc kubenswrapper[4764]: E0309 13:22:46.446384 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:23:18.44637351 +0000 UTC m=+153.696545518 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.558853 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.558904 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:46 crc kubenswrapper[4764]: E0309 13:22:46.558973 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:46 crc kubenswrapper[4764]: E0309 13:22:46.559056 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.559363 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:46 crc kubenswrapper[4764]: E0309 13:22:46.559488 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.005882 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerStarted","Data":"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d"} Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.008295 4764 generic.go:334] "Generic (PLEG): container finished" podID="072442e6-8ece-4f72-a8cb-ad7ef1e3facb" containerID="2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f" exitCode=0 Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.008339 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" event={"ID":"072442e6-8ece-4f72-a8cb-ad7ef1e3facb","Type":"ContainerDied","Data":"2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f"} Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.019725 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.036394 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.048498 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.062807 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.075042 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:47Z is after 2025-08-24T17:21:41Z" 
Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.086074 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.096533 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.107399 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.116448 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.127574 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d0536
2e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubern
etes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.139913 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff4703424
8b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.159934 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.171999 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.182724 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db06
36732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.191884 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:47 crc 
kubenswrapper[4764]: I0309 13:22:47.200554 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.558957 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:47 crc kubenswrapper[4764]: E0309 13:22:47.559113 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.014623 4764 generic.go:334] "Generic (PLEG): container finished" podID="072442e6-8ece-4f72-a8cb-ad7ef1e3facb" containerID="ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0" exitCode=0 Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.014683 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" event={"ID":"072442e6-8ece-4f72-a8cb-ad7ef1e3facb","Type":"ContainerDied","Data":"ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0"} Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.033093 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.053371 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.072350 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.090729 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.101047 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.116778 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.134036 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.154606 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.168798 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.182036 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db06
36732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.193739 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:48 crc 
kubenswrapper[4764]: I0309 13:22:48.208565 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.220510 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 09 
13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.232683 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.243994 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.256152 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.365298 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs\") pod \"network-metrics-daemon-wkwdz\" (UID: \"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\") " pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:48 crc kubenswrapper[4764]: E0309 13:22:48.365509 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:22:48 crc kubenswrapper[4764]: E0309 13:22:48.365599 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs podName:6597fc34-10ee-4984-9c69-f4b7c0d46e2a nodeName:}" failed. No retries permitted until 2026-03-09 13:22:56.365578455 +0000 UTC m=+131.615750373 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs") pod "network-metrics-daemon-wkwdz" (UID: "6597fc34-10ee-4984-9c69-f4b7c0d46e2a") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.559713 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.559749 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.559724 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:48 crc kubenswrapper[4764]: E0309 13:22:48.559844 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:48 crc kubenswrapper[4764]: E0309 13:22:48.559911 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:48 crc kubenswrapper[4764]: E0309 13:22:48.559989 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.022343 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" event={"ID":"072442e6-8ece-4f72-a8cb-ad7ef1e3facb","Type":"ContainerStarted","Data":"d5c773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519"} Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.029162 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerStarted","Data":"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017"} Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.041523 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528f
ed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.056183 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.070060 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.082223 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.091720 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.105433 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.120611 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.132891 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:49Z is after 2025-08-24T17:21:41Z" 
Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.147083 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a8626
15a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.165936 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.174122 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.182557 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db06
36732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.191324 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:49 crc 
kubenswrapper[4764]: I0309 13:22:49.201226 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.208580 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.217938 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.559837 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:49 crc kubenswrapper[4764]: E0309 13:22:49.560072 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:22:50 crc kubenswrapper[4764]: I0309 13:22:50.559825 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:50 crc kubenswrapper[4764]: I0309 13:22:50.559972 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:50 crc kubenswrapper[4764]: E0309 13:22:50.560367 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:50 crc kubenswrapper[4764]: I0309 13:22:50.560103 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:50 crc kubenswrapper[4764]: E0309 13:22:50.560698 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:50 crc kubenswrapper[4764]: E0309 13:22:50.560843 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:50 crc kubenswrapper[4764]: E0309 13:22:50.641059 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.039519 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerStarted","Data":"03b613417c12f1a1de495ba21dfe736f2565fde5ba629162dcc5cbe929369b4b"} Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.039843 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.052145 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.064986 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.067989 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.076415 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.087730 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.098680 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.112499 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.123974 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.132703 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.142760 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.156812 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015
350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:
22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.179923 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03b613417c12f1a1de495ba21dfe736f2565fde5ba629162dcc5cbe929369b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.189279 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.199407 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db06
36732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.207529 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc 
kubenswrapper[4764]: I0309 13:22:51.218870 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.228858 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 
13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.241404 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.251281 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.261513 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.272101 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.283587 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.295558 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" 
Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.305551 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.317429 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.327722 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc 
kubenswrapper[4764]: I0309 13:22:51.339303 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.349183 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.361187 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.373263 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015
350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:
22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.388604 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03b613417c12f1a1de495ba21dfe736f2565fde5ba629162dcc5cbe929369b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.397253 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.405786 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db06
36732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.559074 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:51 crc kubenswrapper[4764]: E0309 13:22:51.559201 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.043186 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.043248 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.070146 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.084598 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528f
ed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.095942 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.106321 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.117584 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.127996 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.140724 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.155053 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.166111 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:52Z is after 2025-08-24T17:21:41Z" 
Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.177936 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a8626
15a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.196979 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03b613417c12f1a1de495ba21dfe736f2565fde5ba629162dcc5cbe929369b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.208392 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.221267 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db06
36732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.230192 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:52 crc 
kubenswrapper[4764]: I0309 13:22:52.247295 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.258602 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.278832 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.559479 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:52 crc kubenswrapper[4764]: E0309 13:22:52.559616 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.559797 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:52 crc kubenswrapper[4764]: E0309 13:22:52.559861 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.559974 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:52 crc kubenswrapper[4764]: E0309 13:22:52.560169 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.486132 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.486926 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.486961 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.486990 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.487006 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:53Z","lastTransitionTime":"2026-03-09T13:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:53 crc kubenswrapper[4764]: E0309 13:22:53.501848 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:53Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.513695 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.513785 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.513807 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.513838 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.513858 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:53Z","lastTransitionTime":"2026-03-09T13:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:53 crc kubenswrapper[4764]: E0309 13:22:53.529323 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:53Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.533940 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.533988 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.534005 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.534028 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.534044 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:53Z","lastTransitionTime":"2026-03-09T13:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:53 crc kubenswrapper[4764]: E0309 13:22:53.548591 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:53Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.553210 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.553244 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.553254 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.553269 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.553279 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:53Z","lastTransitionTime":"2026-03-09T13:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.559888 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:53 crc kubenswrapper[4764]: E0309 13:22:53.560367 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:22:53 crc kubenswrapper[4764]: E0309 13:22:53.565496 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:53Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.570451 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.570499 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.570517 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.570537 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.570552 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:53Z","lastTransitionTime":"2026-03-09T13:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:53 crc kubenswrapper[4764]: E0309 13:22:53.587871 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:53Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:53 crc kubenswrapper[4764]: E0309 13:22:53.588127 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.051769 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovnkube-controller/0.log" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.055895 4764 generic.go:334] "Generic (PLEG): container finished" podID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerID="03b613417c12f1a1de495ba21dfe736f2565fde5ba629162dcc5cbe929369b4b" exitCode=1 Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.055938 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerDied","Data":"03b613417c12f1a1de495ba21dfe736f2565fde5ba629162dcc5cbe929369b4b"} Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.057637 4764 scope.go:117] "RemoveContainer" containerID="03b613417c12f1a1de495ba21dfe736f2565fde5ba629162dcc5cbe929369b4b" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.073225 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528f
ed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.089670 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.104869 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.120147 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.137192 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.153298 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.167446 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.184484 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.199863 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.217256 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\
"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.240839 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03b613417c12f1a1de495ba21dfe736f2565fde5ba629162dcc5cbe929369b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03b613417c12f1a1de495ba21dfe736f2565fde5ba629162dcc5cbe929369b4b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"4 for removal\\\\nI0309 13:22:53.071206 6796 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0309 13:22:53.071312 6796 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 13:22:53.071247 6796 
handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0309 13:22:53.071352 6796 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0309 13:22:53.071377 6796 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0309 13:22:53.071443 6796 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 13:22:53.071459 6796 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 13:22:53.071465 6796 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0309 13:22:53.071481 6796 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0309 13:22:53.071492 6796 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0309 13:22:53.071500 6796 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 13:22:53.071697 6796 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:22:53.071710 6796 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 13:22:53.071739 6796 factory.go:656] Stopping watch factory\\\\nI0309 13:22:53.071758 6796 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:22:53.071779 6796 handler.go:208] Removed *v1.Node event handler 
2\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a288
88ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.253668 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.264689 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.275914 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:54 crc 
kubenswrapper[4764]: I0309 13:22:54.289537 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.299034 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.558908 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.558969 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.559145 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:54 crc kubenswrapper[4764]: E0309 13:22:54.559181 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:54 crc kubenswrapper[4764]: E0309 13:22:54.559352 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:54 crc kubenswrapper[4764]: E0309 13:22:54.559473 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.060825 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovnkube-controller/1.log" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.061624 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovnkube-controller/0.log" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.064810 4764 generic.go:334] "Generic (PLEG): container finished" podID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerID="74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2" exitCode=1 Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.064851 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" 
event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerDied","Data":"74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2"} Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.064892 4764 scope.go:117] "RemoveContainer" containerID="03b613417c12f1a1de495ba21dfe736f2565fde5ba629162dcc5cbe929369b4b" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.066095 4764 scope.go:117] "RemoveContainer" containerID="74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2" Mar 09 13:22:55 crc kubenswrapper[4764]: E0309 13:22:55.066409 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.083876 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528f
ed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.100861 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.118856 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.141086 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.159406 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.177700 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" 
Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.195141 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.208717 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.228936 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.244065 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.257240 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d0536
2e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubern
etes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.277612 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699
cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d
7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\
\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.299335 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03b613417c12f1a1de495ba21dfe736f2565fde5ba629162dcc5cbe929369b4b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"4 for removal\\\\nI0309 13:22:53.071206 6796 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0309 13:22:53.071312 6796 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 13:22:53.071247 6796 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0309 13:22:53.071352 6796 handler.go:190] Sending *v1.Namespace 
event handler 1 for removal\\\\nI0309 13:22:53.071377 6796 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0309 13:22:53.071443 6796 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 13:22:53.071459 6796 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 13:22:53.071465 6796 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0309 13:22:53.071481 6796 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0309 13:22:53.071492 6796 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0309 13:22:53.071500 6796 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 13:22:53.071697 6796 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:22:53.071710 6796 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 13:22:53.071739 6796 factory.go:656] Stopping watch factory\\\\nI0309 13:22:53.071758 6796 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:22:53.071779 6796 handler.go:208] Removed *v1.Node event handler 2\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:22:54Z\\\",\\\"message\\\":\\\"services_controller.go:444] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0309 13:22:54.899454 6914 services_controller.go:445] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0309 13:22:54.899473 6914 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator-webhook for 
network=default\\\\nI0309 13:22:54.899513 6914 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:22:54.899390 6914 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}\\\\nI0309 13:22:54.899576 6914 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0309 13:22:54.899578 6914 services_controller.go:360] Finished syncing service api on namespace openshift-oauth-apiserver for network=default : 1.726105ms\\\\nI0309 13:22:54.899624 6914 services_controller.go:356] Processing sync for service openshift-machine-config-operator/machine-config-daemon for network=default\\\\nF0309 13:22:54.899660 6914 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\"
:\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.312828 4764 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"
hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.322911 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d6
6438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}
\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.336211 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc 
kubenswrapper[4764]: I0309 13:22:55.559112 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:55 crc kubenswrapper[4764]: E0309 13:22:55.559325 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.578869 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.595258 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" 
Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.608683 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.620821 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.632532 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc 
kubenswrapper[4764]: E0309 13:22:55.641497 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.648538 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving
-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.660147 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.684752 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.711701 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015
350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:
22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.732464 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03b613417c12f1a1de495ba21dfe736f2565fde5ba629162dcc5cbe929369b4b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"4 for removal\\\\nI0309 13:22:53.071206 6796 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0309 13:22:53.071312 6796 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 13:22:53.071247 6796 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0309 13:22:53.071352 6796 handler.go:190] Sending *v1.Namespace 
event handler 1 for removal\\\\nI0309 13:22:53.071377 6796 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0309 13:22:53.071443 6796 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 13:22:53.071459 6796 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 13:22:53.071465 6796 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0309 13:22:53.071481 6796 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0309 13:22:53.071492 6796 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0309 13:22:53.071500 6796 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 13:22:53.071697 6796 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:22:53.071710 6796 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 13:22:53.071739 6796 factory.go:656] Stopping watch factory\\\\nI0309 13:22:53.071758 6796 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:22:53.071779 6796 handler.go:208] Removed *v1.Node event handler 2\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:22:54Z\\\",\\\"message\\\":\\\"services_controller.go:444] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0309 13:22:54.899454 6914 services_controller.go:445] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0309 13:22:54.899473 6914 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator-webhook for 
network=default\\\\nI0309 13:22:54.899513 6914 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:22:54.899390 6914 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}\\\\nI0309 13:22:54.899576 6914 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0309 13:22:54.899578 6914 services_controller.go:360] Finished syncing service api on namespace openshift-oauth-apiserver for network=default : 1.726105ms\\\\nI0309 13:22:54.899624 6914 services_controller.go:356] Processing sync for service openshift-machine-config-operator/machine-config-daemon for network=default\\\\nF0309 13:22:54.899660 6914 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\"
:\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.740777 4764 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"
hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.752008 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d6
6438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}
\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.763457 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.778682 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.789782 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.801081 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.070862 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovnkube-controller/1.log" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.076015 4764 scope.go:117] "RemoveContainer" containerID="74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2" Mar 09 13:22:56 crc kubenswrapper[4764]: E0309 13:22:56.076340 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.092041 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.110759 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.129268 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad08333
5c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
26-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.161560 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:22:54Z\\\",\\\"message\\\":\\\"services_controller.go:444] Built service 
openshift-operator-lifecycle-manager/package-server-manager-metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0309 13:22:54.899454 6914 services_controller.go:445] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0309 13:22:54.899473 6914 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator-webhook for network=default\\\\nI0309 13:22:54.899513 6914 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:22:54.899390 6914 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}\\\\nI0309 13:22:54.899576 6914 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0309 13:22:54.899578 6914 services_controller.go:360] Finished syncing service api on namespace openshift-oauth-apiserver for network=default : 1.726105ms\\\\nI0309 13:22:54.899624 6914 services_controller.go:356] Processing sync for service openshift-machine-config-operator/machine-config-daemon for network=default\\\\nF0309 13:22:54.899660 6914 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce308
0adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.176804 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.191821 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.213409 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:56 crc 
kubenswrapper[4764]: I0309 13:22:56.235404 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.253929 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 09 
13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.268246 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.283150 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.297365 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.312707 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.324874 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.338064 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.351049 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.453184 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs\") pod \"network-metrics-daemon-wkwdz\" (UID: \"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\") " pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:56 crc kubenswrapper[4764]: E0309 13:22:56.453372 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:22:56 crc kubenswrapper[4764]: E0309 13:22:56.453595 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs podName:6597fc34-10ee-4984-9c69-f4b7c0d46e2a nodeName:}" failed. No retries permitted until 2026-03-09 13:23:12.453571613 +0000 UTC m=+147.703743591 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs") pod "network-metrics-daemon-wkwdz" (UID: "6597fc34-10ee-4984-9c69-f4b7c0d46e2a") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.559231 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.559271 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.559562 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:56 crc kubenswrapper[4764]: E0309 13:22:56.559780 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:56 crc kubenswrapper[4764]: E0309 13:22:56.559924 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:56 crc kubenswrapper[4764]: E0309 13:22:56.560144 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:57 crc kubenswrapper[4764]: I0309 13:22:57.559514 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:57 crc kubenswrapper[4764]: E0309 13:22:57.561206 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:22:58 crc kubenswrapper[4764]: I0309 13:22:58.558887 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:58 crc kubenswrapper[4764]: I0309 13:22:58.558887 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:58 crc kubenswrapper[4764]: E0309 13:22:58.559358 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:58 crc kubenswrapper[4764]: E0309 13:22:58.559266 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:58 crc kubenswrapper[4764]: I0309 13:22:58.558904 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:58 crc kubenswrapper[4764]: E0309 13:22:58.559439 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:58 crc kubenswrapper[4764]: I0309 13:22:58.722190 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:58 crc kubenswrapper[4764]: I0309 13:22:58.722930 4764 scope.go:117] "RemoveContainer" containerID="74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2" Mar 09 13:22:58 crc kubenswrapper[4764]: E0309 13:22:58.723141 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" Mar 09 13:22:59 crc kubenswrapper[4764]: I0309 13:22:59.559927 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:59 crc kubenswrapper[4764]: E0309 13:22:59.560073 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:00 crc kubenswrapper[4764]: I0309 13:23:00.558686 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:00 crc kubenswrapper[4764]: E0309 13:23:00.559088 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:00 crc kubenswrapper[4764]: I0309 13:23:00.558806 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:00 crc kubenswrapper[4764]: E0309 13:23:00.559402 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:00 crc kubenswrapper[4764]: I0309 13:23:00.558783 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:00 crc kubenswrapper[4764]: E0309 13:23:00.559683 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:00 crc kubenswrapper[4764]: E0309 13:23:00.642693 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:23:01 crc kubenswrapper[4764]: I0309 13:23:01.559824 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:01 crc kubenswrapper[4764]: E0309 13:23:01.560124 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:02 crc kubenswrapper[4764]: I0309 13:23:02.559284 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:02 crc kubenswrapper[4764]: E0309 13:23:02.559920 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:02 crc kubenswrapper[4764]: I0309 13:23:02.559394 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:02 crc kubenswrapper[4764]: E0309 13:23:02.560094 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:02 crc kubenswrapper[4764]: I0309 13:23:02.559341 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:02 crc kubenswrapper[4764]: E0309 13:23:02.560354 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.560051 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:03 crc kubenswrapper[4764]: E0309 13:23:03.560782 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.575049 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.619514 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.619736 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.619797 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.619886 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.619943 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:03Z","lastTransitionTime":"2026-03-09T13:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:03 crc kubenswrapper[4764]: E0309 13:23:03.632292 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:03Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.637363 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.637604 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.637691 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.637782 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.637847 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:03Z","lastTransitionTime":"2026-03-09T13:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:03 crc kubenswrapper[4764]: E0309 13:23:03.652509 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:03Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.656863 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.656905 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.656915 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.656932 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.656943 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:03Z","lastTransitionTime":"2026-03-09T13:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:03 crc kubenswrapper[4764]: E0309 13:23:03.669514 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:03Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.673097 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.673136 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.673145 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.673159 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.673169 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:03Z","lastTransitionTime":"2026-03-09T13:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:03 crc kubenswrapper[4764]: E0309 13:23:03.683624 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:03Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.690182 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.690445 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.690457 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.690472 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.690487 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:03Z","lastTransitionTime":"2026-03-09T13:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:03 crc kubenswrapper[4764]: E0309 13:23:03.703295 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:03Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:03 crc kubenswrapper[4764]: E0309 13:23:03.703427 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:23:04 crc kubenswrapper[4764]: I0309 13:23:04.559518 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:04 crc kubenswrapper[4764]: I0309 13:23:04.559545 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:04 crc kubenswrapper[4764]: E0309 13:23:04.559666 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:04 crc kubenswrapper[4764]: I0309 13:23:04.559562 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:04 crc kubenswrapper[4764]: E0309 13:23:04.559767 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:04 crc kubenswrapper[4764]: E0309 13:23:04.559897 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.559184 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:05 crc kubenswrapper[4764]: E0309 13:23:05.559349 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.576563 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8a532f3-0fa0-4125-aeef-fd19fc524647\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2756a837bdc4537e5cf1848d4c91935799827b254edac2f5a67cf8ff7ce886a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.1
68.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.590638 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528f
ed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.608555 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.621552 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.638141 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:05 crc kubenswrapper[4764]: E0309 13:23:05.643486 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.658359 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.673674 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.694539 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.711197 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" 
Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.725140 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a8626
15a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.745928 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:22:54Z\\\",\\\"message\\\":\\\"services_controller.go:444] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0309 13:22:54.899454 6914 services_controller.go:445] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics LB template configs for 
network=default: []services.lbConfig(nil)\\\\nI0309 13:22:54.899473 6914 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator-webhook for network=default\\\\nI0309 13:22:54.899513 6914 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:22:54.899390 6914 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}\\\\nI0309 13:22:54.899576 6914 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0309 13:22:54.899578 6914 services_controller.go:360] Finished syncing service api on namespace openshift-oauth-apiserver for network=default : 1.726105ms\\\\nI0309 13:22:54.899624 6914 services_controller.go:356] Processing sync for service openshift-machine-config-operator/machine-config-daemon for network=default\\\\nF0309 13:22:54.899660 6914 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce308
0adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.761820 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.778555 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.792879 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:05 crc 
kubenswrapper[4764]: I0309 13:23:05.814016 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.825035 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.838082 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:06 crc kubenswrapper[4764]: I0309 13:23:06.559081 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:06 crc kubenswrapper[4764]: I0309 13:23:06.559217 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:06 crc kubenswrapper[4764]: E0309 13:23:06.559295 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:06 crc kubenswrapper[4764]: E0309 13:23:06.559487 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:06 crc kubenswrapper[4764]: I0309 13:23:06.559763 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:06 crc kubenswrapper[4764]: E0309 13:23:06.559962 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:07 crc kubenswrapper[4764]: I0309 13:23:07.559454 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:07 crc kubenswrapper[4764]: E0309 13:23:07.559606 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:07 crc kubenswrapper[4764]: I0309 13:23:07.574854 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 09 13:23:08 crc kubenswrapper[4764]: I0309 13:23:08.559688 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:08 crc kubenswrapper[4764]: I0309 13:23:08.559689 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:08 crc kubenswrapper[4764]: E0309 13:23:08.559842 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:08 crc kubenswrapper[4764]: I0309 13:23:08.559700 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:08 crc kubenswrapper[4764]: E0309 13:23:08.559998 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:08 crc kubenswrapper[4764]: E0309 13:23:08.560083 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:09 crc kubenswrapper[4764]: I0309 13:23:09.558930 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:09 crc kubenswrapper[4764]: E0309 13:23:09.559160 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:10 crc kubenswrapper[4764]: I0309 13:23:10.558934 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:10 crc kubenswrapper[4764]: E0309 13:23:10.559773 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:10 crc kubenswrapper[4764]: I0309 13:23:10.559045 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:10 crc kubenswrapper[4764]: E0309 13:23:10.559967 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:10 crc kubenswrapper[4764]: I0309 13:23:10.558990 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:10 crc kubenswrapper[4764]: E0309 13:23:10.560227 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:10 crc kubenswrapper[4764]: E0309 13:23:10.645346 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:23:11 crc kubenswrapper[4764]: I0309 13:23:11.559261 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:11 crc kubenswrapper[4764]: E0309 13:23:11.560181 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:12 crc kubenswrapper[4764]: I0309 13:23:12.529977 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs\") pod \"network-metrics-daemon-wkwdz\" (UID: \"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\") " pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:12 crc kubenswrapper[4764]: E0309 13:23:12.530160 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:23:12 crc kubenswrapper[4764]: E0309 13:23:12.530258 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs podName:6597fc34-10ee-4984-9c69-f4b7c0d46e2a nodeName:}" failed. 
No retries permitted until 2026-03-09 13:23:44.530238106 +0000 UTC m=+179.780410014 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs") pod "network-metrics-daemon-wkwdz" (UID: "6597fc34-10ee-4984-9c69-f4b7c0d46e2a") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:23:12 crc kubenswrapper[4764]: I0309 13:23:12.558795 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:12 crc kubenswrapper[4764]: I0309 13:23:12.558795 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:12 crc kubenswrapper[4764]: I0309 13:23:12.558845 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:12 crc kubenswrapper[4764]: E0309 13:23:12.559247 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:12 crc kubenswrapper[4764]: E0309 13:23:12.559304 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:12 crc kubenswrapper[4764]: E0309 13:23:12.559352 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:12 crc kubenswrapper[4764]: I0309 13:23:12.559524 4764 scope.go:117] "RemoveContainer" containerID="74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.124500 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovnkube-controller/1.log" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.127496 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerStarted","Data":"850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6"} Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.127925 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.144493 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:22:54Z\\\",\\\"message\\\":\\\"services_controller.go:444] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0309 13:22:54.899454 6914 services_controller.go:445] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics LB template configs for 
network=default: []services.lbConfig(nil)\\\\nI0309 13:22:54.899473 6914 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator-webhook for network=default\\\\nI0309 13:22:54.899513 6914 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:22:54.899390 6914 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}\\\\nI0309 13:22:54.899576 6914 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0309 13:22:54.899578 6914 services_controller.go:360] Finished syncing service api on namespace openshift-oauth-apiserver for network=default : 1.726105ms\\\\nI0309 13:22:54.899624 6914 services_controller.go:356] Processing sync for service openshift-machine-config-operator/machine-config-daemon for network=default\\\\nF0309 13:22:54.899660 6914 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:23:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.153923 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.164932 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.175557 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc 
kubenswrapper[4764]: I0309 13:23:13.186907 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.197126 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.208518 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.220446 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015
350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:
22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.228495 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8a532f3-0fa0-4125-aeef-fd19fc524647\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2756a837bdc4537e5cf1848d4c91935799827b254edac2f5a67cf8ff7ce886a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.238180 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528f
ed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.246667 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.258589 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.271275 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.284838 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.301406 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.312686 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9db73dd7-a619-4291-a559-aab80ef8e067\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98efc0508ec5c94130c0707470eed2e0ae1117e57a8139af476e03a4bc67e788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698e8e53f96938b60a1aefe2ccea945879623e79d8b0f8836a67cb38ec85918f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:21:16.528766 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 13:21:16.529708 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:21:16.533102 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:21:16.533732 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:21:42.646426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 13:21:46.099155 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:21:46.099249 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7857b72b1425eeefd8acf226dd6f0d6c783e14267fcfb79dd038118573a2b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4cbb681d47f7be8b418b159d390e6857a8a1f5cff40ac621c892019ceef7828\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5476061465c7cd40b23fd92e34348e1889754bbfabe1e2bc6419637cf70604d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.323112 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.333733 4764 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.559059 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:13 crc kubenswrapper[4764]: E0309 13:23:13.559789 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.573538 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.759274 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.759314 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.759325 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.759342 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.759355 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:13Z","lastTransitionTime":"2026-03-09T13:23:13Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:23:13 crc kubenswrapper[4764]: E0309 13:23:13.772169 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.776168 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.776207 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.776216 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.776232 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.776241 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:13Z","lastTransitionTime":"2026-03-09T13:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:13 crc kubenswrapper[4764]: E0309 13:23:13.788149 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.791871 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.791908 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.791919 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.791936 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.791946 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:13Z","lastTransitionTime":"2026-03-09T13:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:13 crc kubenswrapper[4764]: E0309 13:23:13.802465 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.805600 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.805663 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.805681 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.805702 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.805714 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:13Z","lastTransitionTime":"2026-03-09T13:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:13 crc kubenswrapper[4764]: E0309 13:23:13.816426 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.819111 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.819141 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.819152 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.819170 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.819180 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:13Z","lastTransitionTime":"2026-03-09T13:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:13 crc kubenswrapper[4764]: E0309 13:23:13.830001 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: E0309 13:23:13.830132 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.131888 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovnkube-controller/2.log" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.133135 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovnkube-controller/1.log" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.136345 4764 generic.go:334] "Generic (PLEG): container finished" podID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerID="850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6" exitCode=1 Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.136435 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerDied","Data":"850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6"} Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.136506 4764 scope.go:117] "RemoveContainer" containerID="74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.137842 4764 scope.go:117] "RemoveContainer" containerID="850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6" Mar 09 13:23:14 crc kubenswrapper[4764]: E0309 13:23:14.138157 4764 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.154884 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.167601 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.189918 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1507e859-72ea-4c35-a5a7-6d0f48b19b09\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289f145eb4f8412759fdbbc747534a08534b2f848f18206d9f9c2d03adc86944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c220cbb5e42a64e54a853893cbdd6e5502e5f1e5cd7fa5aa7fffc3e27c5b2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98462dd9c7c97ff5402911d179c7739d92d293c4abacea1ed4a394ec9f33dfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4039a3daa857df7f6171121f2a936d4893913c8374ad7229c758dd93a8821386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a9b753c5c47533ae4771e38e6ce4f7a3822bc208557dda27fd1b525c76044f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.204517 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.221980 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.236329 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.251570 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.267379 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.291407 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9db73dd7-a619-4291-a559-aab80ef8e067\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98efc0508ec5c94130c0707470eed2e0ae1117e57a8139af476e03a4bc67e788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698e8e53f96938b60a1aefe2ccea945879623e79d8b0f8836a67cb38ec85918f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:21:16.528766 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 13:21:16.529708 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:21:16.533102 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:21:16.533732 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:21:42.646426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 13:21:46.099155 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:21:46.099249 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7857b72b1425eeefd8acf226dd6f0d6c783e14267fcfb79dd038118573a2b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4cbb681d47f7be8b418b159d390e6857a8a1f5cff40ac621c892019ceef7828\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5476061465c7cd40b23fd92e34348e1889754bbfabe1e2bc6419637cf70604d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.312715 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b6
7b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.328637 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f
13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q
jrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd7
5d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.347752 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:22:54Z\\\",\\\"message\\\":\\\"services_controller.go:444] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0309 13:22:54.899454 6914 services_controller.go:445] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics LB template configs for 
network=default: []services.lbConfig(nil)\\\\nI0309 13:22:54.899473 6914 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator-webhook for network=default\\\\nI0309 13:22:54.899513 6914 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:22:54.899390 6914 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}\\\\nI0309 13:22:54.899576 6914 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0309 13:22:54.899578 6914 services_controller.go:360] Finished syncing service api on namespace openshift-oauth-apiserver for network=default : 1.726105ms\\\\nI0309 13:22:54.899624 6914 services_controller.go:356] Processing sync for service openshift-machine-config-operator/machine-config-daemon for network=default\\\\nF0309 13:22:54.899660 6914 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"n.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0309 13:23:13.282082 7129 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0309 13:23:13.282128 7129 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:23:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888a
e66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.361582 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.370873 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.379993 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc 
kubenswrapper[4764]: I0309 13:23:14.392196 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.402083 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.411160 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8a532f3-0fa0-4125-aeef-fd19fc524647\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2756a837bdc4537e5cf1848d4c91935799827b254edac2f5a67cf8ff7ce886a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.420963 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528f
ed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.559514 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.559611 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.559682 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:14 crc kubenswrapper[4764]: E0309 13:23:14.559704 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:14 crc kubenswrapper[4764]: E0309 13:23:14.559805 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:14 crc kubenswrapper[4764]: E0309 13:23:14.559898 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.144457 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovnkube-controller/2.log" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.152216 4764 scope.go:117] "RemoveContainer" containerID="850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6" Mar 09 13:23:15 crc kubenswrapper[4764]: E0309 13:23:15.152531 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.167626 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8a532f3-0fa0-4125-aeef-fd19fc524647\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2756a837bdc4537e5cf1848d4c91935799827b254edac2f5a67cf8ff7ce886a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.183684 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528f
ed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.199404 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.230446 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1507e859-72ea-4c35-a5a7-6d0f48b19b09\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289f145eb4f8412759fdbbc747534a08534b2f848f18206d9f9c2d03adc86944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c220cbb5e42a64e54a853893cbdd6e5502e5f1e5cd7fa5aa7fffc3e27c5b2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98462dd9c7c97ff5402911d179c7739d92d293c4abacea1ed4a394ec9f33dfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4039a3daa857df7f6171121f2a936d4893913c8374ad7229c758dd93a8821386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a9b753c5c47533ae4771e38e6ce4f7a3822bc208557dda27fd1b525c76044f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.249777 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.269704 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.288793 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.304062 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.321447 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.337285 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9db73dd7-a619-4291-a559-aab80ef8e067\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98efc0508ec5c94130c0707470eed2e0ae1117e57a8139af476e03a4bc67e788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698e8e53f96938b60a1aefe2ccea945879623e79d8b0f8836a67cb38ec85918f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:21:16.528766 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 13:21:16.529708 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:21:16.533102 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:21:16.533732 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:21:42.646426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 13:21:46.099155 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:21:46.099249 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7857b72b1425eeefd8acf226dd6f0d6c783e14267fcfb79dd038118573a2b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4cbb681d47f7be8b418b159d390e6857a8a1f5cff40ac621c892019ceef7828\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5476061465c7cd40b23fd92e34348e1889754bbfabe1e2bc6419637cf70604d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.353232 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.369318 4764 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"init
ContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad08
3335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:2
2:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.394110 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"n.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0309 13:23:13.282082 7129 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0309 13:23:13.282128 7129 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:23:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce308
0adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.408365 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.423836 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.435576 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc 
kubenswrapper[4764]: I0309 13:23:15.452288 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.462974 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.480374 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.559004 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:15 crc kubenswrapper[4764]: E0309 13:23:15.559135 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.575412 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd79
1fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\
\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.594021 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9db73dd7-a619-4291-a559-aab80ef8e067\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98efc0508ec5c94130c0707470eed2e0ae1117e57a8139af476e03a4bc67e788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698e8e53f96938b60a1aefe2ccea945879623e79d8b0f8836a67cb38ec85918f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:21:16.528766 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 13:21:16.529708 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:21:16.533102 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:21:16.533732 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:21:42.646426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 13:21:46.099155 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:21:46.099249 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7857b72b1425eeefd8acf226dd6f0d6c783e14267fcfb79dd038118573a2b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4cbb681d47f7be8b418b159d390e6857a8a1f5cff40ac621c892019ceef7828\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5476061465c7cd40b23fd92e34348e1889754bbfabe1e2bc6419637cf70604d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.608855 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.622914 4764 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.635144 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: E0309 13:23:15.646365 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.660867 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.676702 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc 
kubenswrapper[4764]: I0309 13:23:15.695036 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.708902 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.725384 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.742847 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015
350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:
22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.766597 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"n.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0309 13:23:13.282082 7129 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0309 13:23:13.282128 7129 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:23:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce308
0adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.777559 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.791407 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8a532f3-0fa0-4125-aeef-fd19fc524647\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2756a837bdc4537e5cf1848d4c91935799827b254edac2f5a67cf8ff7ce886a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc 
kubenswrapper[4764]: I0309 13:23:15.803775 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.824155 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1507e859-72ea-4c35-a5a7-6d0f48b19b09\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289f145eb4f8412759fdbbc747534a08534b2f848f18206d9f9c2d03adc86944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c220cbb5e42a64e54a853893cbdd6e5502e5f1e5cd7fa5aa7fffc3e27c5b2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98462dd9c7c97ff5402911d179c7739d92d293c4abacea1ed4a394ec9f33dfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4039a3daa857df7f6171121f2a936d4893913c8374ad7229c758dd93a8821386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a9b753c5c47533ae4771e38e6ce4f7a3822bc208557dda27fd1b525c76044f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.836734 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.849398 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.862607 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:16 crc kubenswrapper[4764]: I0309 13:23:16.559049 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:16 crc kubenswrapper[4764]: I0309 13:23:16.559049 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:16 crc kubenswrapper[4764]: E0309 13:23:16.559705 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:16 crc kubenswrapper[4764]: I0309 13:23:16.559103 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:16 crc kubenswrapper[4764]: E0309 13:23:16.559932 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:16 crc kubenswrapper[4764]: E0309 13:23:16.560051 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:17 crc kubenswrapper[4764]: I0309 13:23:17.559704 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:17 crc kubenswrapper[4764]: E0309 13:23:17.559945 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:18 crc kubenswrapper[4764]: I0309 13:23:18.495022 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:18 crc kubenswrapper[4764]: I0309 13:23:18.495276 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:18 crc kubenswrapper[4764]: I0309 13:23:18.495326 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:18 crc kubenswrapper[4764]: I0309 13:23:18.495370 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" 
(UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:18 crc kubenswrapper[4764]: E0309 13:23:18.495483 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:23:18 crc kubenswrapper[4764]: E0309 13:23:18.495490 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:22.495443959 +0000 UTC m=+217.745615917 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:18 crc kubenswrapper[4764]: E0309 13:23:18.495506 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:23:18 crc kubenswrapper[4764]: E0309 13:23:18.495574 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:24:22.495546942 +0000 UTC m=+217.745719010 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:23:18 crc kubenswrapper[4764]: E0309 13:23:18.495599 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:23:18 crc kubenswrapper[4764]: E0309 13:23:18.495612 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:24:22.495590034 +0000 UTC m=+217.745762162 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:23:18 crc kubenswrapper[4764]: E0309 13:23:18.495624 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:23:18 crc kubenswrapper[4764]: E0309 13:23:18.495663 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:23:18 crc kubenswrapper[4764]: E0309 13:23:18.495715 
4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:24:22.495691967 +0000 UTC m=+217.745863875 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:23:18 crc kubenswrapper[4764]: I0309 13:23:18.495720 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:18 crc kubenswrapper[4764]: E0309 13:23:18.495999 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:23:18 crc kubenswrapper[4764]: E0309 13:23:18.496045 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:23:18 crc kubenswrapper[4764]: E0309 13:23:18.496075 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Mar 09 13:23:18 crc kubenswrapper[4764]: E0309 13:23:18.496172 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:24:22.496148781 +0000 UTC m=+217.746320859 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:23:18 crc kubenswrapper[4764]: I0309 13:23:18.558882 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:18 crc kubenswrapper[4764]: I0309 13:23:18.558903 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:18 crc kubenswrapper[4764]: E0309 13:23:18.559024 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:18 crc kubenswrapper[4764]: I0309 13:23:18.559069 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:18 crc kubenswrapper[4764]: E0309 13:23:18.559199 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:18 crc kubenswrapper[4764]: E0309 13:23:18.559269 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:19 crc kubenswrapper[4764]: I0309 13:23:19.559463 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:19 crc kubenswrapper[4764]: E0309 13:23:19.559704 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:20 crc kubenswrapper[4764]: I0309 13:23:20.559422 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:20 crc kubenswrapper[4764]: I0309 13:23:20.559422 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:20 crc kubenswrapper[4764]: I0309 13:23:20.559495 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:20 crc kubenswrapper[4764]: E0309 13:23:20.561401 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:20 crc kubenswrapper[4764]: E0309 13:23:20.561728 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:20 crc kubenswrapper[4764]: E0309 13:23:20.562556 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:20 crc kubenswrapper[4764]: E0309 13:23:20.648454 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:23:21 crc kubenswrapper[4764]: I0309 13:23:21.559454 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:21 crc kubenswrapper[4764]: E0309 13:23:21.559615 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:22 crc kubenswrapper[4764]: I0309 13:23:22.558893 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:22 crc kubenswrapper[4764]: I0309 13:23:22.558939 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:22 crc kubenswrapper[4764]: E0309 13:23:22.559022 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:22 crc kubenswrapper[4764]: E0309 13:23:22.559088 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:22 crc kubenswrapper[4764]: I0309 13:23:22.559137 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:22 crc kubenswrapper[4764]: E0309 13:23:22.559233 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:23 crc kubenswrapper[4764]: I0309 13:23:23.559406 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:23 crc kubenswrapper[4764]: E0309 13:23:23.559582 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.132032 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.132094 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.132112 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.132140 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.132154 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:24Z","lastTransitionTime":"2026-03-09T13:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:24 crc kubenswrapper[4764]: E0309 13:23:24.158950 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:24Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.167023 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.167380 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.167450 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.167518 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.167582 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:24Z","lastTransitionTime":"2026-03-09T13:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:24 crc kubenswrapper[4764]: E0309 13:23:24.182829 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:24Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.187753 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.188110 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.188181 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.188253 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.188323 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:24Z","lastTransitionTime":"2026-03-09T13:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:24 crc kubenswrapper[4764]: E0309 13:23:24.204213 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:24Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.208909 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.208972 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.208985 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.209022 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.209038 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:24Z","lastTransitionTime":"2026-03-09T13:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:24 crc kubenswrapper[4764]: E0309 13:23:24.225549 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:24Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.231471 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.231515 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.231526 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.231544 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.231554 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:24Z","lastTransitionTime":"2026-03-09T13:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:24 crc kubenswrapper[4764]: E0309 13:23:24.273755 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:24Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:24 crc kubenswrapper[4764]: E0309 13:23:24.273884 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.558694 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.558764 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.558704 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:24 crc kubenswrapper[4764]: E0309 13:23:24.558848 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:24 crc kubenswrapper[4764]: E0309 13:23:24.558918 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:24 crc kubenswrapper[4764]: E0309 13:23:24.559023 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.559373 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:25 crc kubenswrapper[4764]: E0309 13:23:25.559552 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.584918 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.604825 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9db73dd7-a619-4291-a559-aab80ef8e067\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98efc0508ec5c94130c0707470eed2e0ae1117e57a8139af476e03a4bc67e788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698e8e53f96938b60a1aefe2ccea945879623e79d8b0f8836a67cb38ec85918f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:21:16.528766 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 13:21:16.529708 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:21:16.533102 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:21:16.533732 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:21:42.646426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 13:21:46.099155 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:21:46.099249 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7857b72b1425eeefd8acf226dd6f0d6c783e14267fcfb79dd038118573a2b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4cbb681d47f7be8b418b159d390e6857a8a1f5cff40ac621c892019ceef7828\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5476061465c7cd40b23fd92e34348e1889754bbfabe1e2bc6419637cf70604d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.624066 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.638766 4764 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: E0309 13:23:25.649344 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.660576 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.678608 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.692323 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.709879 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d0536
2e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubern
etes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.727986 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699
cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d
7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\
\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.752623 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"n.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0309 13:23:13.282082 7129 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0309 13:23:13.282128 7129 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:23:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce308
0adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.769069 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.787336 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.803417 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc 
kubenswrapper[4764]: I0309 13:23:25.814563 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8a532f3-0fa0-4125-aeef-fd19fc524647\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2756a837bdc4537e5cf1848d4c91935799827b254edac2f5a67cf8ff7ce886a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.829829 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528f
ed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.852294 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1507e859-72ea-4c35-a5a7-6d0f48b19b09\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289f145eb4f8412759fdbbc747534a08534b2f848f18206d9f9c2d03adc86944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c220cbb5e42a64e54a853893cbdd6e5502e5f1e5cd7fa5aa7fffc3e27c5b2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98462dd9c7c97ff5402911d179c7739d92d293c4abacea1ed4a394ec9f33dfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4039a3daa857df7f6171121f2a936d4893913c8374ad7229c758dd93a8821386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a9b753c5c47533ae4771e38e6ce4f7a3822bc208557dda27fd1b525c76044f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.868437 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.885515 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.910097 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:26 crc kubenswrapper[4764]: I0309 13:23:26.559279 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:26 crc kubenswrapper[4764]: I0309 13:23:26.559298 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:26 crc kubenswrapper[4764]: I0309 13:23:26.559312 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:26 crc kubenswrapper[4764]: E0309 13:23:26.560902 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:26 crc kubenswrapper[4764]: E0309 13:23:26.561008 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:26 crc kubenswrapper[4764]: E0309 13:23:26.561080 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:27 crc kubenswrapper[4764]: I0309 13:23:27.559161 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:27 crc kubenswrapper[4764]: E0309 13:23:27.559347 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.195094 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zmzm7_202a1f58-ce83-4374-ac48-dc806f7b9d6b/kube-multus/0.log" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.195156 4764 generic.go:334] "Generic (PLEG): container finished" podID="202a1f58-ce83-4374-ac48-dc806f7b9d6b" containerID="05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331" exitCode=1 Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.195192 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zmzm7" event={"ID":"202a1f58-ce83-4374-ac48-dc806f7b9d6b","Type":"ContainerDied","Data":"05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331"} Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.195669 4764 scope.go:117] "RemoveContainer" containerID="05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.212512 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4
c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.239242 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"n.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0309 13:23:13.282082 7129 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0309 13:23:13.282128 7129 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:23:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce308
0adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.250475 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.261418 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.270166 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc 
kubenswrapper[4764]: I0309 13:23:28.286357 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.300064 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.316822 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:27Z\\\",\\\"message\\\":\\\"2026-03-09T13:22:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a97e94f9-7ade-431c-bdc1-eded365600db\\\\n2026-03-09T13:22:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a97e94f9-7ade-431c-bdc1-eded365600db to /host/opt/cni/bin/\\\\n2026-03-09T13:22:42Z [verbose] multus-daemon started\\\\n2026-03-09T13:22:42Z [verbose] Readiness Indicator file check\\\\n2026-03-09T13:23:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.325194 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8a532f3-0fa0-4125-aeef-fd19fc524647\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2756a837bdc4537e5cf1848d4c91935799827b254edac2f5a67cf8ff7ce886a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.335323 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.344990 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.362548 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1507e859-72ea-4c35-a5a7-6d0f48b19b09\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289f145eb4f8412759fdbbc747534a08534b2f848f18206d9f9c2d03adc86944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c220cbb5e42a64e54a853893cbdd6e5502e5f1e5cd7fa5aa7fffc3e27c5b2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98462dd9c7c97ff5402911d179c7739d92d293c4abacea1ed4a394ec9f33dfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4039a3daa857df7f6171121f2a936d4893913c8374ad7229c758dd93a8821386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a9b753c5c47533ae4771e38e6ce4f7a3822bc208557dda27fd1b525c76044f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.373298 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.384715 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.395218 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.405704 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.418737 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.429267 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9db73dd7-a619-4291-a559-aab80ef8e067\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98efc0508ec5c94130c0707470eed2e0ae1117e57a8139af476e03a4bc67e788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698e8e53f96938b60a1aefe2ccea945879623e79d8b0f8836a67cb38ec85918f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:21:16.528766 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 13:21:16.529708 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:21:16.533102 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:21:16.533732 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:21:42.646426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 13:21:46.099155 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:21:46.099249 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7857b72b1425eeefd8acf226dd6f0d6c783e14267fcfb79dd038118573a2b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4cbb681d47f7be8b418b159d390e6857a8a1f5cff40ac621c892019ceef7828\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5476061465c7cd40b23fd92e34348e1889754bbfabe1e2bc6419637cf70604d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.440965 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.559634 4764 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.559687 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.559704 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:28 crc kubenswrapper[4764]: E0309 13:23:28.559774 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:28 crc kubenswrapper[4764]: E0309 13:23:28.559960 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:28 crc kubenswrapper[4764]: E0309 13:23:28.560087 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.200355 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zmzm7_202a1f58-ce83-4374-ac48-dc806f7b9d6b/kube-multus/0.log" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.200452 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zmzm7" event={"ID":"202a1f58-ce83-4374-ac48-dc806f7b9d6b","Type":"ContainerStarted","Data":"ae16a6074b17ec1df33139f5a8f220d15bea5658fc20ad653e2af7b2427dec6a"} Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.212565 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc 
kubenswrapper[4764]: I0309 13:23:29.225877 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.237473 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.250840 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae16a6074b17ec1df33139f5a8f220d15bea5658fc20ad653e2af7b2427dec6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:27Z\\\",\\\"message\\\":\\\"2026-03-09T13:22:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a97e94f9-7ade-431c-bdc1-eded365600db\\\\n2026-03-09T13:22:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a97e94f9-7ade-431c-bdc1-eded365600db to /host/opt/cni/bin/\\\\n2026-03-09T13:22:42Z [verbose] multus-daemon started\\\\n2026-03-09T13:22:42Z [verbose] 
Readiness Indicator file check\\\\n2026-03-09T13:23:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:23:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.266091 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c
773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.283210 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"n.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0309 13:23:13.282082 7129 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0309 13:23:13.282128 7129 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:23:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce308
0adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.295837 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.307813 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.317998 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8a532f3-0fa0-4125-aeef-fd19fc524647\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2756a837bdc4537e5cf1848d4c91935799827b254edac2f5a67cf8ff7ce886a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.330992 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528f
ed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.354015 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1507e859-72ea-4c35-a5a7-6d0f48b19b09\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289f145eb4f8412759fdbbc747534a08534b2f848f18206d9f9c2d03adc86944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c220cbb5e42a64e54a853893cbdd6e5502e5f1e5cd7fa5aa7fffc3e27c5b2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98462dd9c7c97ff5402911d179c7739d92d293c4abacea1ed4a394ec9f33dfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4039a3daa857df7f6171121f2a936d4893913c8374ad7229c758dd93a8821386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a9b753c5c47533ae4771e38e6ce4f7a3822bc208557dda27fd1b525c76044f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.365105 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.375938 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.386192 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.400992 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.415452 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9db73dd7-a619-4291-a559-aab80ef8e067\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98efc0508ec5c94130c0707470eed2e0ae1117e57a8139af476e03a4bc67e788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698e8e53f96938b60a1aefe2ccea945879623e79d8b0f8836a67cb38ec85918f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:21:16.528766 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 13:21:16.529708 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:21:16.533102 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:21:16.533732 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:21:42.646426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 13:21:46.099155 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:21:46.099249 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7857b72b1425eeefd8acf226dd6f0d6c783e14267fcfb79dd038118573a2b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4cbb681d47f7be8b418b159d390e6857a8a1f5cff40ac621c892019ceef7828\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5476061465c7cd40b23fd92e34348e1889754bbfabe1e2bc6419637cf70604d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.428977 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.441907 4764 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.456949 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.559180 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:29 crc kubenswrapper[4764]: E0309 13:23:29.559331 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.559993 4764 scope.go:117] "RemoveContainer" containerID="850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6" Mar 09 13:23:29 crc kubenswrapper[4764]: E0309 13:23:29.560159 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" Mar 09 13:23:30 crc kubenswrapper[4764]: I0309 13:23:30.559676 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:30 crc kubenswrapper[4764]: I0309 13:23:30.559777 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:30 crc kubenswrapper[4764]: I0309 13:23:30.559676 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:30 crc kubenswrapper[4764]: E0309 13:23:30.559800 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:30 crc kubenswrapper[4764]: E0309 13:23:30.559927 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:30 crc kubenswrapper[4764]: E0309 13:23:30.559990 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:30 crc kubenswrapper[4764]: E0309 13:23:30.651361 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:23:31 crc kubenswrapper[4764]: I0309 13:23:31.558657 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:31 crc kubenswrapper[4764]: E0309 13:23:31.558795 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:32 crc kubenswrapper[4764]: I0309 13:23:32.559462 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:32 crc kubenswrapper[4764]: I0309 13:23:32.559583 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:32 crc kubenswrapper[4764]: I0309 13:23:32.559462 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:32 crc kubenswrapper[4764]: E0309 13:23:32.559598 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:32 crc kubenswrapper[4764]: E0309 13:23:32.559760 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:32 crc kubenswrapper[4764]: E0309 13:23:32.559860 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:33 crc kubenswrapper[4764]: I0309 13:23:33.559713 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:33 crc kubenswrapper[4764]: E0309 13:23:33.559913 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.559487 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.559487 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:34 crc kubenswrapper[4764]: E0309 13:23:34.560176 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.559575 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:34 crc kubenswrapper[4764]: E0309 13:23:34.560261 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:34 crc kubenswrapper[4764]: E0309 13:23:34.560758 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.650034 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.650123 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.650145 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.650176 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.650198 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:34Z","lastTransitionTime":"2026-03-09T13:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:34 crc kubenswrapper[4764]: E0309 13:23:34.674456 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.680285 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.680360 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.680383 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.680412 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.680433 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:34Z","lastTransitionTime":"2026-03-09T13:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:34 crc kubenswrapper[4764]: E0309 13:23:34.700833 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.704572 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.704605 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.704624 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.704638 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.704683 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:34Z","lastTransitionTime":"2026-03-09T13:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:34 crc kubenswrapper[4764]: E0309 13:23:34.721302 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.725756 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.725807 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.725821 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.725840 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.725853 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:34Z","lastTransitionTime":"2026-03-09T13:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:34 crc kubenswrapper[4764]: E0309 13:23:34.737612 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.741158 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.741194 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.741207 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.741223 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.741234 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:34Z","lastTransitionTime":"2026-03-09T13:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:34 crc kubenswrapper[4764]: E0309 13:23:34.761210 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:34 crc kubenswrapper[4764]: E0309 13:23:34.761368 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.559448 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:35 crc kubenswrapper[4764]: E0309 13:23:35.559632 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.583241 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1507e859-72ea-4c35-a5a7-6d0f48b19b09\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289f145eb4f8412759fdbbc747534a08534b2f848f18206d9f9c2d03adc86944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c220cbb5e42a64e54a853893cbdd6e5502e5f1e5cd7fa5aa7fffc3e27c5b2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98462dd9c7c97ff5402911d179c7739d92d293c4abacea1ed4a394ec9f33dfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4039a3daa857df7f6171121f2a936d4893913c8374ad7229c758dd93a8821386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a9b753c5c47533ae4771e38e6ce4f7a3822bc208557dda27fd1b525c76044f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:48Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.597800 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.613982 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.630572 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.645100 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: E0309 13:23:35.651956 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.665066 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"l
astState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.680991 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9db73dd7-a619-4291-a559-aab80ef8e067\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98efc0508ec5c94130c0707470eed2e0ae1117e57a8139af476e03a4bc67e788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698e8e53f96938b60a1aefe2ccea945879623e79d8b0f8836a67cb38ec85918f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:21:16.528766 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 13:21:16.529708 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:21:16.533102 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:21:16.533732 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:21:42.646426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 13:21:46.099155 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:21:46.099249 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7857b72b1425eeefd8acf226dd6f0d6c783e14267fcfb79dd038118573a2b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4cbb681d47f7be8b418b159d390e6857a8a1f5cff40ac621c892019ceef7828\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5476061465c7cd40b23fd92e34348e1889754bbfabe1e2bc6419637cf70604d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.698160 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.710996 4764 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.728190 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"n.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0309 13:23:13.282082 7129 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0309 13:23:13.282128 7129 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:23:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce308
0adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.738971 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.750165 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.763825 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc 
kubenswrapper[4764]: I0309 13:23:35.779376 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.795185 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.812384 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae16a6074b17ec1df33139f5a8f220d15bea5658fc20ad653e2af7b2427dec6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:27Z\\\",\\\"message\\\":\\\"2026-03-09T13:22:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a97e94f9-7ade-431c-bdc1-eded365600db\\\\n2026-03-09T13:22:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a97e94f9-7ade-431c-bdc1-eded365600db to /host/opt/cni/bin/\\\\n2026-03-09T13:22:42Z [verbose] multus-daemon started\\\\n2026-03-09T13:22:42Z [verbose] 
Readiness Indicator file check\\\\n2026-03-09T13:23:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:23:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.829403 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c
773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.848119 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8a532f3-0fa0-4125-aeef-fd19fc524647\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2756a837bdc4537e5cf1848d4c91935799827b254edac2f5a67cf8ff7ce886a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.862583 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528f
ed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:36 crc kubenswrapper[4764]: I0309 13:23:36.559526 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:36 crc kubenswrapper[4764]: I0309 13:23:36.559589 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:36 crc kubenswrapper[4764]: E0309 13:23:36.559776 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:36 crc kubenswrapper[4764]: I0309 13:23:36.559820 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:36 crc kubenswrapper[4764]: E0309 13:23:36.559964 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:36 crc kubenswrapper[4764]: E0309 13:23:36.560153 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:37 crc kubenswrapper[4764]: I0309 13:23:37.559564 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:37 crc kubenswrapper[4764]: E0309 13:23:37.559824 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:38 crc kubenswrapper[4764]: I0309 13:23:38.558727 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:38 crc kubenswrapper[4764]: I0309 13:23:38.558818 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:38 crc kubenswrapper[4764]: E0309 13:23:38.558897 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:38 crc kubenswrapper[4764]: I0309 13:23:38.558961 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:38 crc kubenswrapper[4764]: E0309 13:23:38.559131 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:38 crc kubenswrapper[4764]: E0309 13:23:38.559180 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:39 crc kubenswrapper[4764]: I0309 13:23:39.559880 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:39 crc kubenswrapper[4764]: E0309 13:23:39.560116 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:40 crc kubenswrapper[4764]: I0309 13:23:40.559024 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:40 crc kubenswrapper[4764]: E0309 13:23:40.559539 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:40 crc kubenswrapper[4764]: I0309 13:23:40.559099 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:40 crc kubenswrapper[4764]: I0309 13:23:40.559178 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:40 crc kubenswrapper[4764]: E0309 13:23:40.559822 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:40 crc kubenswrapper[4764]: E0309 13:23:40.560012 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:40 crc kubenswrapper[4764]: I0309 13:23:40.561524 4764 scope.go:117] "RemoveContainer" containerID="850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6" Mar 09 13:23:40 crc kubenswrapper[4764]: E0309 13:23:40.653790 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.243567 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovnkube-controller/2.log" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.245565 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerStarted","Data":"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"} Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.246092 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.258799 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z 
is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.269736 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.280108 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.292549 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.304312 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9db73dd7-a619-4291-a559-aab80ef8e067\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98efc0508ec5c94130c0707470eed2e0ae1117e57a8139af476e03a4bc67e788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698e8e53f96938b60a1aefe2ccea945879623e79d8b0f8836a67cb38ec85918f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:21:16.528766 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 13:21:16.529708 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:21:16.533102 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:21:16.533732 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:21:42.646426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 13:21:46.099155 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:21:46.099249 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7857b72b1425eeefd8acf226dd6f0d6c783e14267fcfb79dd038118573a2b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4cbb681d47f7be8b418b159d390e6857a8a1f5cff40ac621c892019ceef7828\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5476061465c7cd40b23fd92e34348e1889754bbfabe1e2bc6419637cf70604d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.315965 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae16a6074b17ec1df33139f5a8f220d15bea5658fc20ad653e2af7b2427dec6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b6
7b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:27Z\\\",\\\"message\\\":\\\"2026-03-09T13:22:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a97e94f9-7ade-431c-bdc1-eded365600db\\\\n2026-03-09T13:22:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a97e94f9-7ade-431c-bdc1-eded365600db to /host/opt/cni/bin/\\\\n2026-03-09T13:22:42Z [verbose] multus-daemon started\\\\n2026-03-09T13:22:42Z [verbose] Readiness Indicator file check\\\\n2026-03-09T13:23:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:23:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.330920 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.348512 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"n.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0309 13:23:13.282082 7129 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0309 13:23:13.282128 7129 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:23:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.358876 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.368925 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.379990 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc 
kubenswrapper[4764]: I0309 13:23:41.391588 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.401401 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.411075 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8a532f3-0fa0-4125-aeef-fd19fc524647\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2756a837bdc4537e5cf1848d4c91935799827b254edac2f5a67cf8ff7ce886a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.420588 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528f
ed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.431631 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.443075 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.463912 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1507e859-72ea-4c35-a5a7-6d0f48b19b09\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289f145eb4f8412759fdbbc747534a08534b2f848f18206d9f9c2d03adc86944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c220cbb5e42a64e54a853893cbdd6e5502e5f1e5cd7fa5aa7fffc3e27c5b2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98462dd9c7c97ff5402911d179c7739d92d293c4abacea1ed4a394ec9f33dfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4039a3daa857df7f6171121f2a936d4893913c8374ad7229c758dd93a8821386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a9b753c5c47533ae4771e38e6ce4f7a3822bc208557dda27fd1b525c76044f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.476057 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.558846 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:41 crc kubenswrapper[4764]: E0309 13:23:41.559140 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.251525 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovnkube-controller/3.log" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.252270 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovnkube-controller/2.log" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.255351 4764 generic.go:334] "Generic (PLEG): container finished" podID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerID="9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f" exitCode=1 Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.255419 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerDied","Data":"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"} Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.255495 4764 scope.go:117] "RemoveContainer" containerID="850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.256405 4764 scope.go:117] "RemoveContainer" containerID="9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f" Mar 09 13:23:42 crc kubenswrapper[4764]: E0309 13:23:42.256651 
4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.267727 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8a532f3-0fa0-4125-aeef-fd19fc524647\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2756a837bdc4537e5cf1848d4c91935799827b254edac2f5a67cf8ff7ce886a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tr
ue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.285405 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.302804 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.331577 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1507e859-72ea-4c35-a5a7-6d0f48b19b09\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289f145eb4f8412759fdbbc747534a08534b2f848f18206d9f9c2d03adc86944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c220cbb5e42a64e54a853893cbdd6e5502e5f1e5cd7fa5aa7fffc3e27c5b2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98462dd9c7c97ff5402911d179c7739d92d293c4abacea1ed4a394ec9f33dfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4039a3daa857df7f6171121f2a936d4893913c8374ad7229c758dd93a8821386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a9b753c5c47533ae4771e38e6ce4f7a3822bc208557dda27fd1b525c76044f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.344099 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.354998 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.366515 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.381703 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.393781 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.405756 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9db73dd7-a619-4291-a559-aab80ef8e067\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98efc0508ec5c94130c0707470eed2e0ae1117e57a8139af476e03a4bc67e788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698e8e53f96938b60a1aefe2ccea945879623e79d8b0f8836a67cb38ec85918f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:21:16.528766 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 13:21:16.529708 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:21:16.533102 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:21:16.533732 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:21:42.646426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 13:21:46.099155 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:21:46.099249 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7857b72b1425eeefd8acf226dd6f0d6c783e14267fcfb79dd038118573a2b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4cbb681d47f7be8b418b159d390e6857a8a1f5cff40ac621c892019ceef7828\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5476061465c7cd40b23fd92e34348e1889754bbfabe1e2bc6419637cf70604d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.423876 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.438729 4764 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"init
ContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad08
3335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:2
2:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.455626 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"n.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0309 13:23:13.282082 7129 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0309 13:23:13.282128 7129 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:23:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:41Z\\\",\\\"message\\\":\\\"NetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 13:23:41.398455 7479 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:23:41.398816 7479 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) 
from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:23:41.399063 7479 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:23:41.399497 7479 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:23:41.399594 7479 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0309 13:23:41.399615 7479 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 13:23:41.399627 7479 factory.go:656] Stopping watch factory\\\\nI0309 13:23:41.399641 7479 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 13:23:41.445020 7479 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0309 13:23:41.445268 7479 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0309 13:23:41.445408 7479 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:23:41.445455 7479 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 13:23:41.445595 7479 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.466400 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.477525 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.487982 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc 
kubenswrapper[4764]: I0309 13:23:42.500741 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.508842 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.518984 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae16a6074b17ec1df33139f5a8f220d15bea5658fc20ad653e2af7b2427dec6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:27Z\\\",\\\"message\\\":\\\"2026-03-09T13:22:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a97e94f9-7ade-431c-bdc1-eded365600db\\\\n2026-03-09T13:22:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a97e94f9-7ade-431c-bdc1-eded365600db to /host/opt/cni/bin/\\\\n2026-03-09T13:22:42Z [verbose] multus-daemon started\\\\n2026-03-09T13:22:42Z [verbose] 
Readiness Indicator file check\\\\n2026-03-09T13:23:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:23:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.558808 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.558846 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.558917 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:42 crc kubenswrapper[4764]: E0309 13:23:42.559051 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:42 crc kubenswrapper[4764]: E0309 13:23:42.559157 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:42 crc kubenswrapper[4764]: E0309 13:23:42.559229 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.259632 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovnkube-controller/3.log" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.263152 4764 scope.go:117] "RemoveContainer" containerID="9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f" Mar 09 13:23:43 crc kubenswrapper[4764]: E0309 13:23:43.263355 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.273070 
4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8a532f3-0fa0-4125-aeef-fd19fc524647\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2756a837bdc4537e5cf1848d4c91935799827b254edac2f5a67cf8ff7ce886a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b7
2105acb42653213963d1e52516f0c43f982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.283981 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528f
ed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.303347 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1507e859-72ea-4c35-a5a7-6d0f48b19b09\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289f145eb4f8412759fdbbc747534a08534b2f848f18206d9f9c2d03adc86944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c220cbb5e42a64e54a853893cbdd6e5502e5f1e5cd7fa5aa7fffc3e27c5b2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98462dd9c7c97ff5402911d179c7739d92d293c4abacea1ed4a394ec9f33dfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4039a3daa857df7f6171121f2a936d4893913c8374ad7229c758dd93a8821386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a9b753c5c47533ae4771e38e6ce4f7a3822bc208557dda27fd1b525c76044f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.314301 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.325207 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.338422 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.350047 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.360880 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9db73dd7-a619-4291-a559-aab80ef8e067\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98efc0508ec5c94130c0707470eed2e0ae1117e57a8139af476e03a4bc67e788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698e8e53f96938b60a1aefe2ccea945879623e79d8b0f8836a67cb38ec85918f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:21:16.528766 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 13:21:16.529708 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:21:16.533102 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:21:16.533732 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:21:42.646426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 13:21:46.099155 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:21:46.099249 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7857b72b1425eeefd8acf226dd6f0d6c783e14267fcfb79dd038118573a2b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4cbb681d47f7be8b418b159d390e6857a8a1f5cff40ac621c892019ceef7828\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5476061465c7cd40b23fd92e34348e1889754bbfabe1e2bc6419637cf70604d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.370982 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.380610 4764 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.391943 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.402438 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.412940 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.422259 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc 
kubenswrapper[4764]: I0309 13:23:43.433722 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.443858 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.455365 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae16a6074b17ec1df33139f5a8f220d15bea5658fc20ad653e2af7b2427dec6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:27Z\\\",\\\"message\\\":\\\"2026-03-09T13:22:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a97e94f9-7ade-431c-bdc1-eded365600db\\\\n2026-03-09T13:22:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a97e94f9-7ade-431c-bdc1-eded365600db to /host/opt/cni/bin/\\\\n2026-03-09T13:22:42Z [verbose] multus-daemon started\\\\n2026-03-09T13:22:42Z [verbose] 
Readiness Indicator file check\\\\n2026-03-09T13:23:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:23:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.470824 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c
773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.488307 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:41Z\\\",\\\"message\\\":\\\"NetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 13:23:41.398455 7479 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:23:41.398816 7479 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:23:41.399063 7479 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:23:41.399497 7479 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:23:41.399594 7479 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0309 13:23:41.399615 7479 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 13:23:41.399627 7479 factory.go:656] Stopping watch factory\\\\nI0309 13:23:41.399641 7479 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 13:23:41.445020 7479 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0309 13:23:41.445268 7479 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0309 13:23:41.445408 7479 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:23:41.445455 7479 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 13:23:41.445595 7479 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:23:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce308
0adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.559077 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:43 crc kubenswrapper[4764]: E0309 13:23:43.559284 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.558871 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.558964 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:44 crc kubenswrapper[4764]: E0309 13:23:44.559093 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.558920 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:44 crc kubenswrapper[4764]: E0309 13:23:44.559396 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:44 crc kubenswrapper[4764]: E0309 13:23:44.559567 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.585949 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs\") pod \"network-metrics-daemon-wkwdz\" (UID: \"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\") " pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:44 crc kubenswrapper[4764]: E0309 13:23:44.586723 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:23:44 crc kubenswrapper[4764]: E0309 13:23:44.587020 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs podName:6597fc34-10ee-4984-9c69-f4b7c0d46e2a nodeName:}" failed. No retries permitted until 2026-03-09 13:24:48.586834751 +0000 UTC m=+243.837006839 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs") pod "network-metrics-daemon-wkwdz" (UID: "6597fc34-10ee-4984-9c69-f4b7c0d46e2a") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.869023 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.869069 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.869078 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.869095 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.869107 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:44Z","lastTransitionTime":"2026-03-09T13:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:44 crc kubenswrapper[4764]: E0309 13:23:44.882958 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.887161 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.887201 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.887210 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.887223 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.887231 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:44Z","lastTransitionTime":"2026-03-09T13:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:44 crc kubenswrapper[4764]: E0309 13:23:44.906351 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.911080 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.911184 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.911231 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.911260 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.911280 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:44Z","lastTransitionTime":"2026-03-09T13:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:44 crc kubenswrapper[4764]: E0309 13:23:44.926028 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.930338 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.930380 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.930389 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.930406 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.930427 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:44Z","lastTransitionTime":"2026-03-09T13:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:44 crc kubenswrapper[4764]: E0309 13:23:44.949884 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.955044 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.955088 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.955096 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.955111 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.955123 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:44Z","lastTransitionTime":"2026-03-09T13:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:44 crc kubenswrapper[4764]: E0309 13:23:44.967172 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:44 crc kubenswrapper[4764]: E0309 13:23:44.967299 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:23:45 crc kubenswrapper[4764]: I0309 13:23:45.559691 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:45 crc kubenswrapper[4764]: E0309 13:23:45.559805 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:45 crc kubenswrapper[4764]: I0309 13:23:45.579295 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1507e859-72ea-4c35-a5a7-6d0f48b19b09\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289f145eb4f8412759fdbbc747534a08534b2f848f18206d9f9c2d03adc86944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c220cbb5e42a64e54a853893cbdd6e5502e5f1e5cd7fa5aa7fffc3e27c5b2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98462dd9c7c97ff5402911d179c7739d92d293c4abacea1ed4a394ec9f33dfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4039a3daa857df7f6171121f2a936d4893913c8374ad7229c758dd93a8821386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a9b753c5c47533ae4771e38e6ce4f7a3822bc208557dda27fd1b525c76044f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:48Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:45 crc kubenswrapper[4764]: I0309 13:23:45.590894 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:45 crc kubenswrapper[4764]: I0309 13:23:45.606486 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:45 crc kubenswrapper[4764]: E0309 13:23:45.654693 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 13:23:45 crc kubenswrapper[4764]: I0309 13:23:45.656878 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=89.656861313 podStartE2EDuration="1m29.656861313s" podCreationTimestamp="2026-03-09 13:22:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:45.656551303 +0000 UTC m=+180.906723211" watchObservedRunningTime="2026-03-09 13:23:45.656861313 +0000 UTC m=+180.907033221" Mar 09 13:23:45 crc kubenswrapper[4764]: I0309 13:23:45.673370 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=38.673352324 podStartE2EDuration="38.673352324s" podCreationTimestamp="2026-03-09 13:23:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:45.673043834 +0000 UTC m=+180.923215742" watchObservedRunningTime="2026-03-09 13:23:45.673352324 +0000 UTC m=+180.923524232" Mar 09 13:23:45 crc kubenswrapper[4764]: I0309 13:23:45.738597 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-hbmjc" podStartSLOduration=115.738571904 podStartE2EDuration="1m55.738571904s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:45.728385158 +0000 UTC m=+180.978557066" watchObservedRunningTime="2026-03-09 13:23:45.738571904 +0000 UTC m=+180.988743812" Mar 09 13:23:45 crc kubenswrapper[4764]: I0309 13:23:45.750944 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" podStartSLOduration=114.750924757 
podStartE2EDuration="1m54.750924757s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:45.739474802 +0000 UTC m=+180.989646750" watchObservedRunningTime="2026-03-09 13:23:45.750924757 +0000 UTC m=+181.001096665" Mar 09 13:23:45 crc kubenswrapper[4764]: I0309 13:23:45.776555 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-r5bnx" podStartSLOduration=115.77653816 podStartE2EDuration="1m55.77653816s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:45.772362491 +0000 UTC m=+181.022534399" watchObservedRunningTime="2026-03-09 13:23:45.77653816 +0000 UTC m=+181.026710068" Mar 09 13:23:45 crc kubenswrapper[4764]: I0309 13:23:45.803221 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-zmzm7" podStartSLOduration=115.803202096 podStartE2EDuration="1m55.803202096s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:45.78850186 +0000 UTC m=+181.038673768" watchObservedRunningTime="2026-03-09 13:23:45.803202096 +0000 UTC m=+181.053374004" Mar 09 13:23:45 crc kubenswrapper[4764]: I0309 13:23:45.813546 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-crvdf" podStartSLOduration=115.813525065 podStartE2EDuration="1m55.813525065s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:45.802860115 +0000 UTC m=+181.053032053" 
watchObservedRunningTime="2026-03-09 13:23:45.813525065 +0000 UTC m=+181.063696973" Mar 09 13:23:45 crc kubenswrapper[4764]: I0309 13:23:45.814215 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=42.814209127 podStartE2EDuration="42.814209127s" podCreationTimestamp="2026-03-09 13:23:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:45.813431343 +0000 UTC m=+181.063603271" watchObservedRunningTime="2026-03-09 13:23:45.814209127 +0000 UTC m=+181.064381035" Mar 09 13:23:45 crc kubenswrapper[4764]: I0309 13:23:45.824122 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podStartSLOduration=115.824107063 podStartE2EDuration="1m55.824107063s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:45.823271147 +0000 UTC m=+181.073443055" watchObservedRunningTime="2026-03-09 13:23:45.824107063 +0000 UTC m=+181.074278981" Mar 09 13:23:46 crc kubenswrapper[4764]: I0309 13:23:46.559601 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:46 crc kubenswrapper[4764]: E0309 13:23:46.559817 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:46 crc kubenswrapper[4764]: I0309 13:23:46.559826 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:46 crc kubenswrapper[4764]: I0309 13:23:46.559925 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:46 crc kubenswrapper[4764]: E0309 13:23:46.560088 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:46 crc kubenswrapper[4764]: E0309 13:23:46.560207 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:47 crc kubenswrapper[4764]: I0309 13:23:47.559388 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:47 crc kubenswrapper[4764]: E0309 13:23:47.559776 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:48 crc kubenswrapper[4764]: I0309 13:23:48.558663 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:48 crc kubenswrapper[4764]: E0309 13:23:48.558825 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:48 crc kubenswrapper[4764]: I0309 13:23:48.558671 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:48 crc kubenswrapper[4764]: E0309 13:23:48.558912 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:48 crc kubenswrapper[4764]: I0309 13:23:48.558672 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:48 crc kubenswrapper[4764]: E0309 13:23:48.558989 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:49 crc kubenswrapper[4764]: I0309 13:23:49.558841 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:49 crc kubenswrapper[4764]: E0309 13:23:49.559072 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:50 crc kubenswrapper[4764]: I0309 13:23:50.559217 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:50 crc kubenswrapper[4764]: I0309 13:23:50.559221 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:50 crc kubenswrapper[4764]: E0309 13:23:50.559902 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:50 crc kubenswrapper[4764]: E0309 13:23:50.559818 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:50 crc kubenswrapper[4764]: I0309 13:23:50.559304 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:50 crc kubenswrapper[4764]: E0309 13:23:50.560047 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:50 crc kubenswrapper[4764]: E0309 13:23:50.655882 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:23:51 crc kubenswrapper[4764]: I0309 13:23:51.559470 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:51 crc kubenswrapper[4764]: E0309 13:23:51.559684 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:52 crc kubenswrapper[4764]: I0309 13:23:52.558929 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:52 crc kubenswrapper[4764]: I0309 13:23:52.559029 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:52 crc kubenswrapper[4764]: I0309 13:23:52.558959 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:52 crc kubenswrapper[4764]: E0309 13:23:52.559121 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:52 crc kubenswrapper[4764]: E0309 13:23:52.559271 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:52 crc kubenswrapper[4764]: E0309 13:23:52.559397 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:53 crc kubenswrapper[4764]: I0309 13:23:53.559459 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:53 crc kubenswrapper[4764]: E0309 13:23:53.559803 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:54 crc kubenswrapper[4764]: I0309 13:23:54.559592 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:54 crc kubenswrapper[4764]: I0309 13:23:54.559700 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:54 crc kubenswrapper[4764]: E0309 13:23:54.559747 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:54 crc kubenswrapper[4764]: I0309 13:23:54.559800 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:54 crc kubenswrapper[4764]: E0309 13:23:54.559982 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 13:23:54 crc kubenswrapper[4764]: E0309 13:23:54.560129 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.150370 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.150428 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.150440 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.150463 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.150477 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:55Z","lastTransitionTime":"2026-03-09T13:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.199787 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s"]
Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.200381 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s"
Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.202546 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.202596 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.202562 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.205576 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.232020 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=42.232001317 podStartE2EDuration="42.232001317s" podCreationTimestamp="2026-03-09 13:23:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:55.231237943 +0000 UTC m=+190.481409871" watchObservedRunningTime="2026-03-09 13:23:55.232001317 +0000 UTC m=+190.482173245"
Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.245974 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=71.245949129 podStartE2EDuration="1m11.245949129s" podCreationTimestamp="2026-03-09 13:22:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:55.243302347 +0000 UTC m=+190.493474255" watchObservedRunningTime="2026-03-09 13:23:55.245949129 +0000 UTC m=+190.496121037"
Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.298456 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c544e05e-a876-4e15-bcfe-947cad49b850-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lpx7s\" (UID: \"c544e05e-a876-4e15-bcfe-947cad49b850\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s"
Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.298537 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c544e05e-a876-4e15-bcfe-947cad49b850-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lpx7s\" (UID: \"c544e05e-a876-4e15-bcfe-947cad49b850\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s"
Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.298614 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c544e05e-a876-4e15-bcfe-947cad49b850-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lpx7s\" (UID: \"c544e05e-a876-4e15-bcfe-947cad49b850\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s"
Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.298633 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c544e05e-a876-4e15-bcfe-947cad49b850-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lpx7s\" (UID: \"c544e05e-a876-4e15-bcfe-947cad49b850\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s"
Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.298698 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c544e05e-a876-4e15-bcfe-947cad49b850-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lpx7s\" (UID: \"c544e05e-a876-4e15-bcfe-947cad49b850\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s"
Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.400632 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c544e05e-a876-4e15-bcfe-947cad49b850-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lpx7s\" (UID: \"c544e05e-a876-4e15-bcfe-947cad49b850\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s"
Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.400766 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c544e05e-a876-4e15-bcfe-947cad49b850-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lpx7s\" (UID: \"c544e05e-a876-4e15-bcfe-947cad49b850\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s"
Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.400844 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c544e05e-a876-4e15-bcfe-947cad49b850-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lpx7s\" (UID: \"c544e05e-a876-4e15-bcfe-947cad49b850\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s"
Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.400874 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c544e05e-a876-4e15-bcfe-947cad49b850-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lpx7s\" (UID: \"c544e05e-a876-4e15-bcfe-947cad49b850\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s"
Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.400907 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c544e05e-a876-4e15-bcfe-947cad49b850-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lpx7s\" (UID: \"c544e05e-a876-4e15-bcfe-947cad49b850\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s"
Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.400950 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c544e05e-a876-4e15-bcfe-947cad49b850-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lpx7s\" (UID: \"c544e05e-a876-4e15-bcfe-947cad49b850\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s"
Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.400961 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c544e05e-a876-4e15-bcfe-947cad49b850-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lpx7s\" (UID: \"c544e05e-a876-4e15-bcfe-947cad49b850\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s"
Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.402530 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c544e05e-a876-4e15-bcfe-947cad49b850-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lpx7s\" (UID: \"c544e05e-a876-4e15-bcfe-947cad49b850\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s"
Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.411317 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c544e05e-a876-4e15-bcfe-947cad49b850-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lpx7s\" (UID: \"c544e05e-a876-4e15-bcfe-947cad49b850\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s"
Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.427634 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c544e05e-a876-4e15-bcfe-947cad49b850-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lpx7s\" (UID: \"c544e05e-a876-4e15-bcfe-947cad49b850\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s"
Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.515288 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s"
Mar 09 13:23:55 crc kubenswrapper[4764]: W0309 13:23:55.530738 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc544e05e_a876_4e15_bcfe_947cad49b850.slice/crio-13736dcabd84d0c7eb41312a46b31318b883e83b56a656f38df72ee8b2fe4764 WatchSource:0}: Error finding container 13736dcabd84d0c7eb41312a46b31318b883e83b56a656f38df72ee8b2fe4764: Status 404 returned error can't find the container with id 13736dcabd84d0c7eb41312a46b31318b883e83b56a656f38df72ee8b2fe4764
Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.558950 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz"
Mar 09 13:23:55 crc kubenswrapper[4764]: E0309 13:23:55.560378 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a"
Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.610735 4764 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.620439 4764 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 09 13:23:55 crc kubenswrapper[4764]: E0309 13:23:55.656569 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 09 13:23:56 crc kubenswrapper[4764]: I0309 13:23:56.305982 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s" event={"ID":"c544e05e-a876-4e15-bcfe-947cad49b850","Type":"ContainerStarted","Data":"d2b10a3465d5a7b3351bf4f381a7550f42a9f289553c4ac878e37c5c549ade28"}
Mar 09 13:23:56 crc kubenswrapper[4764]: I0309 13:23:56.306060 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s" event={"ID":"c544e05e-a876-4e15-bcfe-947cad49b850","Type":"ContainerStarted","Data":"13736dcabd84d0c7eb41312a46b31318b883e83b56a656f38df72ee8b2fe4764"}
Mar 09 13:23:56 crc kubenswrapper[4764]: I0309 13:23:56.328097 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s" podStartSLOduration=126.328075056 podStartE2EDuration="2m6.328075056s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:56.32790183 +0000 UTC m=+191.578073808" watchObservedRunningTime="2026-03-09 13:23:56.328075056 +0000 UTC m=+191.578246954"
Mar 09 13:23:56 crc kubenswrapper[4764]: I0309 13:23:56.559683 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:23:56 crc kubenswrapper[4764]: I0309 13:23:56.559760 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:23:56 crc kubenswrapper[4764]: I0309 13:23:56.559792 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:23:56 crc kubenswrapper[4764]: E0309 13:23:56.559826 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 13:23:56 crc kubenswrapper[4764]: E0309 13:23:56.559870 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 13:23:56 crc kubenswrapper[4764]: E0309 13:23:56.559940 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 13:23:56 crc kubenswrapper[4764]: I0309 13:23:56.560558 4764 scope.go:117] "RemoveContainer" containerID="9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"
Mar 09 13:23:56 crc kubenswrapper[4764]: E0309 13:23:56.560753 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a"
Mar 09 13:23:57 crc kubenswrapper[4764]: I0309 13:23:57.559739 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz"
Mar 09 13:23:57 crc kubenswrapper[4764]: E0309 13:23:57.559922 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a"
Mar 09 13:23:58 crc kubenswrapper[4764]: I0309 13:23:58.559249 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:23:58 crc kubenswrapper[4764]: I0309 13:23:58.559303 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:23:58 crc kubenswrapper[4764]: E0309 13:23:58.559386 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 13:23:58 crc kubenswrapper[4764]: I0309 13:23:58.559441 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:23:58 crc kubenswrapper[4764]: E0309 13:23:58.559574 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 13:23:58 crc kubenswrapper[4764]: E0309 13:23:58.559772 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 13:23:59 crc kubenswrapper[4764]: I0309 13:23:59.559179 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz"
Mar 09 13:23:59 crc kubenswrapper[4764]: E0309 13:23:59.559514 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a"
Mar 09 13:24:00 crc kubenswrapper[4764]: I0309 13:24:00.559335 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:24:00 crc kubenswrapper[4764]: I0309 13:24:00.559336 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:24:00 crc kubenswrapper[4764]: I0309 13:24:00.559348 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:24:00 crc kubenswrapper[4764]: E0309 13:24:00.559526 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 13:24:00 crc kubenswrapper[4764]: E0309 13:24:00.559630 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 13:24:00 crc kubenswrapper[4764]: E0309 13:24:00.559766 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 13:24:00 crc kubenswrapper[4764]: E0309 13:24:00.657825 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 09 13:24:01 crc kubenswrapper[4764]: I0309 13:24:01.559621 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz"
Mar 09 13:24:01 crc kubenswrapper[4764]: E0309 13:24:01.559904 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a"
Mar 09 13:24:02 crc kubenswrapper[4764]: I0309 13:24:02.559072 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:24:02 crc kubenswrapper[4764]: I0309 13:24:02.559089 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:24:02 crc kubenswrapper[4764]: I0309 13:24:02.559169 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:24:02 crc kubenswrapper[4764]: E0309 13:24:02.559252 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 13:24:02 crc kubenswrapper[4764]: E0309 13:24:02.559419 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 13:24:02 crc kubenswrapper[4764]: E0309 13:24:02.559527 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 13:24:03 crc kubenswrapper[4764]: I0309 13:24:03.559421 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz"
Mar 09 13:24:03 crc kubenswrapper[4764]: E0309 13:24:03.559547 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a"
Mar 09 13:24:04 crc kubenswrapper[4764]: I0309 13:24:04.558978 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:24:04 crc kubenswrapper[4764]: I0309 13:24:04.559105 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:24:04 crc kubenswrapper[4764]: E0309 13:24:04.559190 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 13:24:04 crc kubenswrapper[4764]: I0309 13:24:04.559152 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:24:04 crc kubenswrapper[4764]: E0309 13:24:04.559692 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 13:24:04 crc kubenswrapper[4764]: E0309 13:24:04.559742 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 13:24:05 crc kubenswrapper[4764]: I0309 13:24:05.559364 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz"
Mar 09 13:24:05 crc kubenswrapper[4764]: E0309 13:24:05.560695 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a"
Mar 09 13:24:05 crc kubenswrapper[4764]: E0309 13:24:05.658439 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 09 13:24:06 crc kubenswrapper[4764]: I0309 13:24:06.559756 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:24:06 crc kubenswrapper[4764]: I0309 13:24:06.559834 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:24:06 crc kubenswrapper[4764]: I0309 13:24:06.559758 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:24:06 crc kubenswrapper[4764]: E0309 13:24:06.560008 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 13:24:06 crc kubenswrapper[4764]: E0309 13:24:06.560231 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 13:24:06 crc kubenswrapper[4764]: E0309 13:24:06.560279 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 13:24:07 crc kubenswrapper[4764]: I0309 13:24:07.559715 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz"
Mar 09 13:24:07 crc kubenswrapper[4764]: E0309 13:24:07.560049 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a"
Mar 09 13:24:08 crc kubenswrapper[4764]: I0309 13:24:08.559261 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:24:08 crc kubenswrapper[4764]: I0309 13:24:08.559372 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:24:08 crc kubenswrapper[4764]: E0309 13:24:08.559391 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 13:24:08 crc kubenswrapper[4764]: I0309 13:24:08.559487 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:24:08 crc kubenswrapper[4764]: E0309 13:24:08.559682 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 13:24:08 crc kubenswrapper[4764]: E0309 13:24:08.559854 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 13:24:09 crc kubenswrapper[4764]: I0309 13:24:09.558896 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz"
Mar 09 13:24:09 crc kubenswrapper[4764]: E0309 13:24:09.559068 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a"
Mar 09 13:24:10 crc kubenswrapper[4764]: I0309 13:24:10.559512 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:24:10 crc kubenswrapper[4764]: I0309 13:24:10.559581 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:24:10 crc kubenswrapper[4764]: I0309 13:24:10.560153 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:24:10 crc kubenswrapper[4764]: E0309 13:24:10.560290 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 13:24:10 crc kubenswrapper[4764]: E0309 13:24:10.560405 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 13:24:10 crc kubenswrapper[4764]: I0309 13:24:10.560458 4764 scope.go:117] "RemoveContainer" containerID="9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"
Mar 09 13:24:10 crc kubenswrapper[4764]: E0309 13:24:10.560535 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 13:24:10 crc kubenswrapper[4764]: E0309 13:24:10.560899 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a"
Mar 09 13:24:10 crc kubenswrapper[4764]: E0309 13:24:10.659684 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 09 13:24:11 crc kubenswrapper[4764]: I0309 13:24:11.558772 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz"
Mar 09 13:24:11 crc kubenswrapper[4764]: E0309 13:24:11.558901 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a"
Mar 09 13:24:12 crc kubenswrapper[4764]: I0309 13:24:12.559348 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:24:12 crc kubenswrapper[4764]: I0309 13:24:12.559430 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:24:12 crc kubenswrapper[4764]: E0309 13:24:12.559529 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 13:24:12 crc kubenswrapper[4764]: I0309 13:24:12.559563 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:24:12 crc kubenswrapper[4764]: E0309 13:24:12.559829 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 13:24:12 crc kubenswrapper[4764]: E0309 13:24:12.560012 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 13:24:13 crc kubenswrapper[4764]: I0309 13:24:13.558822 4764 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:24:13 crc kubenswrapper[4764]: E0309 13:24:13.558968 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:24:14 crc kubenswrapper[4764]: I0309 13:24:14.360339 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zmzm7_202a1f58-ce83-4374-ac48-dc806f7b9d6b/kube-multus/1.log" Mar 09 13:24:14 crc kubenswrapper[4764]: I0309 13:24:14.361333 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zmzm7_202a1f58-ce83-4374-ac48-dc806f7b9d6b/kube-multus/0.log" Mar 09 13:24:14 crc kubenswrapper[4764]: I0309 13:24:14.361379 4764 generic.go:334] "Generic (PLEG): container finished" podID="202a1f58-ce83-4374-ac48-dc806f7b9d6b" containerID="ae16a6074b17ec1df33139f5a8f220d15bea5658fc20ad653e2af7b2427dec6a" exitCode=1 Mar 09 13:24:14 crc kubenswrapper[4764]: I0309 13:24:14.361406 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zmzm7" event={"ID":"202a1f58-ce83-4374-ac48-dc806f7b9d6b","Type":"ContainerDied","Data":"ae16a6074b17ec1df33139f5a8f220d15bea5658fc20ad653e2af7b2427dec6a"} Mar 09 13:24:14 crc kubenswrapper[4764]: I0309 13:24:14.361442 4764 scope.go:117] "RemoveContainer" containerID="05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331" Mar 09 13:24:14 crc kubenswrapper[4764]: I0309 13:24:14.361822 4764 scope.go:117] "RemoveContainer" containerID="ae16a6074b17ec1df33139f5a8f220d15bea5658fc20ad653e2af7b2427dec6a" Mar 09 13:24:14 crc kubenswrapper[4764]: E0309 13:24:14.362007 4764 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-zmzm7_openshift-multus(202a1f58-ce83-4374-ac48-dc806f7b9d6b)\"" pod="openshift-multus/multus-zmzm7" podUID="202a1f58-ce83-4374-ac48-dc806f7b9d6b" Mar 09 13:24:14 crc kubenswrapper[4764]: I0309 13:24:14.558839 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:24:14 crc kubenswrapper[4764]: I0309 13:24:14.558928 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:24:14 crc kubenswrapper[4764]: E0309 13:24:14.558967 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:24:14 crc kubenswrapper[4764]: I0309 13:24:14.558933 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:24:14 crc kubenswrapper[4764]: E0309 13:24:14.559072 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:24:14 crc kubenswrapper[4764]: E0309 13:24:14.559184 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:24:15 crc kubenswrapper[4764]: I0309 13:24:15.366790 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zmzm7_202a1f58-ce83-4374-ac48-dc806f7b9d6b/kube-multus/1.log" Mar 09 13:24:15 crc kubenswrapper[4764]: I0309 13:24:15.558932 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:24:15 crc kubenswrapper[4764]: E0309 13:24:15.559963 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:24:15 crc kubenswrapper[4764]: E0309 13:24:15.660392 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:24:16 crc kubenswrapper[4764]: I0309 13:24:16.559085 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:24:16 crc kubenswrapper[4764]: I0309 13:24:16.559117 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:24:16 crc kubenswrapper[4764]: I0309 13:24:16.559139 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:24:16 crc kubenswrapper[4764]: E0309 13:24:16.559222 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:24:16 crc kubenswrapper[4764]: E0309 13:24:16.559316 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:24:16 crc kubenswrapper[4764]: E0309 13:24:16.559390 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:24:17 crc kubenswrapper[4764]: I0309 13:24:17.559753 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:24:17 crc kubenswrapper[4764]: E0309 13:24:17.559911 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:24:18 crc kubenswrapper[4764]: I0309 13:24:18.558710 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:24:18 crc kubenswrapper[4764]: I0309 13:24:18.558772 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:24:18 crc kubenswrapper[4764]: I0309 13:24:18.558729 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:24:18 crc kubenswrapper[4764]: E0309 13:24:18.558850 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:24:18 crc kubenswrapper[4764]: E0309 13:24:18.558946 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:24:18 crc kubenswrapper[4764]: E0309 13:24:18.558980 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:24:19 crc kubenswrapper[4764]: I0309 13:24:19.559907 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:24:19 crc kubenswrapper[4764]: E0309 13:24:19.560110 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:24:20 crc kubenswrapper[4764]: I0309 13:24:20.559121 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:24:20 crc kubenswrapper[4764]: I0309 13:24:20.559217 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:24:20 crc kubenswrapper[4764]: I0309 13:24:20.559324 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:24:20 crc kubenswrapper[4764]: E0309 13:24:20.559309 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:24:20 crc kubenswrapper[4764]: E0309 13:24:20.559508 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:24:20 crc kubenswrapper[4764]: E0309 13:24:20.559617 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:24:20 crc kubenswrapper[4764]: E0309 13:24:20.661552 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:24:21 crc kubenswrapper[4764]: I0309 13:24:21.559116 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:24:21 crc kubenswrapper[4764]: E0309 13:24:21.559385 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:24:22 crc kubenswrapper[4764]: I0309 13:24:22.559405 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:24:22 crc kubenswrapper[4764]: I0309 13:24:22.559449 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:24:22 crc kubenswrapper[4764]: I0309 13:24:22.559421 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:24:22 crc kubenswrapper[4764]: E0309 13:24:22.559553 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:24:22 crc kubenswrapper[4764]: E0309 13:24:22.559601 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:24:22 crc kubenswrapper[4764]: E0309 13:24:22.559665 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:24:22 crc kubenswrapper[4764]: I0309 13:24:22.590937 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:22 crc kubenswrapper[4764]: I0309 13:24:22.591024 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:24:22 crc kubenswrapper[4764]: I0309 13:24:22.591055 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:24:22 crc kubenswrapper[4764]: I0309 13:24:22.591096 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:24:22 crc kubenswrapper[4764]: I0309 13:24:22.591119 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:24:22 crc kubenswrapper[4764]: E0309 13:24:22.591168 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:24:22 crc kubenswrapper[4764]: E0309 13:24:22.591205 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:26:24.591172262 +0000 UTC m=+339.841344180 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:22 crc kubenswrapper[4764]: E0309 13:24:22.591246 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:26:24.591234864 +0000 UTC m=+339.841406782 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:24:22 crc kubenswrapper[4764]: E0309 13:24:22.591270 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:24:22 crc kubenswrapper[4764]: E0309 13:24:22.591347 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:26:24.591329797 +0000 UTC m=+339.841501705 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:24:22 crc kubenswrapper[4764]: E0309 13:24:22.591279 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:24:22 crc kubenswrapper[4764]: E0309 13:24:22.591420 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:24:22 crc kubenswrapper[4764]: E0309 13:24:22.591447 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:24:22 crc kubenswrapper[4764]: E0309 13:24:22.591207 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:24:22 crc kubenswrapper[4764]: E0309 13:24:22.592240 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:24:22 crc kubenswrapper[4764]: E0309 13:24:22.592267 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:24:22 crc kubenswrapper[4764]: E0309 13:24:22.592359 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:26:24.591498531 +0000 UTC m=+339.841670519 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:24:22 crc kubenswrapper[4764]: E0309 13:24:22.592409 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:26:24.592394575 +0000 UTC m=+339.842566503 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:24:23 crc kubenswrapper[4764]: I0309 13:24:23.559767 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:24:23 crc kubenswrapper[4764]: E0309 13:24:23.560109 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:24:24 crc kubenswrapper[4764]: I0309 13:24:24.559280 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:24:24 crc kubenswrapper[4764]: I0309 13:24:24.559337 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:24:24 crc kubenswrapper[4764]: I0309 13:24:24.559356 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:24:24 crc kubenswrapper[4764]: E0309 13:24:24.559454 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:24:24 crc kubenswrapper[4764]: E0309 13:24:24.559577 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:24:24 crc kubenswrapper[4764]: E0309 13:24:24.559677 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 13:24:25 crc kubenswrapper[4764]: I0309 13:24:25.559066 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz"
Mar 09 13:24:25 crc kubenswrapper[4764]: E0309 13:24:25.560476 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a"
Mar 09 13:24:25 crc kubenswrapper[4764]: I0309 13:24:25.561317 4764 scope.go:117] "RemoveContainer" containerID="9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"
Mar 09 13:24:25 crc kubenswrapper[4764]: E0309 13:24:25.662206 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 09 13:24:26 crc kubenswrapper[4764]: I0309 13:24:26.399959 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovnkube-controller/3.log"
Mar 09 13:24:26 crc kubenswrapper[4764]: I0309 13:24:26.402592 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerStarted","Data":"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29"}
Mar 09 13:24:26 crc kubenswrapper[4764]: I0309 13:24:26.403049 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv"
Mar 09 13:24:26 crc kubenswrapper[4764]: I0309 13:24:26.427099 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wkwdz"]
Mar 09 13:24:26 crc kubenswrapper[4764]: I0309 13:24:26.427229 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz"
Mar 09 13:24:26 crc kubenswrapper[4764]: E0309 13:24:26.427334 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a"
Mar 09 13:24:26 crc kubenswrapper[4764]: I0309 13:24:26.558768 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:24:26 crc kubenswrapper[4764]: I0309 13:24:26.558794 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:24:26 crc kubenswrapper[4764]: I0309 13:24:26.558795 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:24:26 crc kubenswrapper[4764]: E0309 13:24:26.558903 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 13:24:26 crc kubenswrapper[4764]: E0309 13:24:26.559036 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 13:24:26 crc kubenswrapper[4764]: E0309 13:24:26.559134 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 13:24:27 crc kubenswrapper[4764]: I0309 13:24:27.558993 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz"
Mar 09 13:24:27 crc kubenswrapper[4764]: E0309 13:24:27.559260 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a"
Mar 09 13:24:27 crc kubenswrapper[4764]: I0309 13:24:27.559451 4764 scope.go:117] "RemoveContainer" containerID="ae16a6074b17ec1df33139f5a8f220d15bea5658fc20ad653e2af7b2427dec6a"
Mar 09 13:24:27 crc kubenswrapper[4764]: I0309 13:24:27.577129 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podStartSLOduration=157.577113176 podStartE2EDuration="2m37.577113176s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:26.43320687 +0000 UTC m=+221.683378788" watchObservedRunningTime="2026-03-09 13:24:27.577113176 +0000 UTC m=+222.827285084"
Mar 09 13:24:28 crc kubenswrapper[4764]: I0309 13:24:28.411496 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zmzm7_202a1f58-ce83-4374-ac48-dc806f7b9d6b/kube-multus/1.log"
Mar 09 13:24:28 crc kubenswrapper[4764]: I0309 13:24:28.411902 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zmzm7" event={"ID":"202a1f58-ce83-4374-ac48-dc806f7b9d6b","Type":"ContainerStarted","Data":"63894bb3203376238f9013fa9abe0d5e50f67b5675ce93e7ce0b6cdd92b326b3"}
Mar 09 13:24:28 crc kubenswrapper[4764]: I0309 13:24:28.559518 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:24:28 crc kubenswrapper[4764]: E0309 13:24:28.559881 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 13:24:28 crc kubenswrapper[4764]: I0309 13:24:28.559538 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:24:28 crc kubenswrapper[4764]: E0309 13:24:28.560455 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 13:24:28 crc kubenswrapper[4764]: I0309 13:24:28.559518 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:24:28 crc kubenswrapper[4764]: E0309 13:24:28.560764 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 13:24:29 crc kubenswrapper[4764]: I0309 13:24:29.558931 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz"
Mar 09 13:24:29 crc kubenswrapper[4764]: E0309 13:24:29.559058 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a"
Mar 09 13:24:30 crc kubenswrapper[4764]: I0309 13:24:30.559492 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:24:30 crc kubenswrapper[4764]: E0309 13:24:30.560126 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 13:24:30 crc kubenswrapper[4764]: I0309 13:24:30.559529 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:24:30 crc kubenswrapper[4764]: E0309 13:24:30.560384 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 13:24:30 crc kubenswrapper[4764]: I0309 13:24:30.559512 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:24:30 crc kubenswrapper[4764]: E0309 13:24:30.560625 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 13:24:31 crc kubenswrapper[4764]: I0309 13:24:31.559128 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz"
Mar 09 13:24:31 crc kubenswrapper[4764]: I0309 13:24:31.561029 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 09 13:24:31 crc kubenswrapper[4764]: I0309 13:24:31.563107 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 09 13:24:32 crc kubenswrapper[4764]: I0309 13:24:32.558678 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:24:32 crc kubenswrapper[4764]: I0309 13:24:32.558710 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:24:32 crc kubenswrapper[4764]: I0309 13:24:32.558733 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:24:32 crc kubenswrapper[4764]: I0309 13:24:32.562861 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 09 13:24:32 crc kubenswrapper[4764]: I0309 13:24:32.563689 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 09 13:24:32 crc kubenswrapper[4764]: I0309 13:24:32.563730 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 09 13:24:32 crc kubenswrapper[4764]: I0309 13:24:32.564120 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.849943 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.893561 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gst9d"]
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.894432 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-gst9d"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.903826 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.903837 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.903948 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.904482 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.904488 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.904632 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.905158 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.906343 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tgqwl"]
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.906582 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.906805 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.907234 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.908712 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ptjnd"]
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.909283 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.910448 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.914042 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.914117 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.914695 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.915385 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.915412 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.915414 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.915530 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.915796 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.915862 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.916182 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.916269 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.917308 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.929529 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j"]
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.929623 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h4rc\" (UniqueName: \"kubernetes.io/projected/1ffb8d96-e6e4-4859-ae7d-37f900979485-kube-api-access-9h4rc\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.929937 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1ffb8d96-e6e4-4859-ae7d-37f900979485-image-import-ca\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.929992 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ffb8d96-e6e4-4859-ae7d-37f900979485-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.930015 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1ffb8d96-e6e4-4859-ae7d-37f900979485-encryption-config\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.930036 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1ffb8d96-e6e4-4859-ae7d-37f900979485-audit\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.930067 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1ffb8d96-e6e4-4859-ae7d-37f900979485-etcd-serving-ca\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.930088 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ffb8d96-e6e4-4859-ae7d-37f900979485-serving-cert\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.930118 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1ffb8d96-e6e4-4859-ae7d-37f900979485-etcd-client\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.930147 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1ffb8d96-e6e4-4859-ae7d-37f900979485-audit-dir\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.930175 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1ffb8d96-e6e4-4859-ae7d-37f900979485-node-pullsecrets\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.930197 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ffb8d96-e6e4-4859-ae7d-37f900979485-config\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.930876 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.934515 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l"]
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.934990 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.936911 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t8ft9"]
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.937843 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.939761 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.946436 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nj856"]
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.962173 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-nj856"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.965481 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r"]
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.966408 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.970031 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-sp2mq"]
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.970808 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.980782 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.982907 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.983162 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.983439 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.983630 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.984121 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-sp2mq"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.984847 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.986476 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448"]
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.986895 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-8g9lj"]
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.987233 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c2m6k"]
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.987596 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-c2m6k"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.988487 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.988604 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.988730 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.988841 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.988852 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.989005 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8g9lj"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.989171 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.989477 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.991418 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.991507 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nnrmm"]
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.991542 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.992253 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.992456 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.992821 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.992850 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.992922 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.992969 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.993022 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.993065 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.993149 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.993158 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.993236 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.993258 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.996358 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7"]
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.996637 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.996799 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.996887 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.997006 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.997108 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.997199 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm"
Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.998234 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.000286 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc"]
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.000998 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.003993 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.004295 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.004546 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.007834 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5g2pp"]
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.008058 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.008274 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.008445 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.008567 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.008684 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.009098 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5g2pp"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.015131 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-x927s"]
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.015815 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-x927s"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.025895 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.026481 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.028548 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.028889 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.028985 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029082 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029146 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029169 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029222 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029259 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029344 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029349 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029426 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029430 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029471 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029503 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029626 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029753 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029768 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029840 4764 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029907 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029949 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.030103 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.030191 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.030289 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.030363 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.030725 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.030985 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ad49b4d8-1218-4a34-8455-831d0f563cbf-client-ca\") pod \"route-controller-manager-6576b87f9c-mvq6r\" (UID: \"ad49b4d8-1218-4a34-8455-831d0f563cbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.031015 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/e025e897-5ff3-476b-81c9-afdd0ae7a25f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-84448\" (UID: \"e025e897-5ff3-476b-81c9-afdd0ae7a25f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.036016 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e025e897-5ff3-476b-81c9-afdd0ae7a25f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-84448\" (UID: \"e025e897-5ff3-476b-81c9-afdd0ae7a25f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.036386 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qxpr\" (UniqueName: \"kubernetes.io/projected/e025e897-5ff3-476b-81c9-afdd0ae7a25f-kube-api-access-2qxpr\") pod \"cluster-image-registry-operator-dc59b4c8b-84448\" (UID: \"e025e897-5ff3-476b-81c9-afdd0ae7a25f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.036422 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-encryption-config\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.036662 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4951d770-ae8c-470a-982a-807c82112722-service-ca-bundle\") pod \"authentication-operator-69f744f599-ptjnd\" (UID: \"4951d770-ae8c-470a-982a-807c82112722\") 
" pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.036714 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f9zl\" (UniqueName: \"kubernetes.io/projected/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-kube-api-access-9f9zl\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.036861 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23f87d2b-2a92-4abb-a2a6-2de508837343-serving-cert\") pod \"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.037001 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b625331d-48ab-4d48-86fd-fe73466305ff-serving-cert\") pod \"controller-manager-879f6c89f-tgqwl\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.037027 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p79zf\" (UniqueName: \"kubernetes.io/projected/b625331d-48ab-4d48-86fd-fe73466305ff-kube-api-access-p79zf\") pod \"controller-manager-879f6c89f-tgqwl\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.037148 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.037301 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-client-ca\") pod \"controller-manager-879f6c89f-tgqwl\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.037329 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-oauth-config\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.037354 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.037379 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgk9n\" (UniqueName: \"kubernetes.io/projected/6dc446a1-b77b-4f15-ae5f-0141bf374cdd-kube-api-access-pgk9n\") pod \"openshift-apiserver-operator-796bbdcf4f-k5dfc\" (UID: 
\"6dc446a1-b77b-4f15-ae5f-0141bf374cdd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.037401 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tgqwl\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.037420 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee2ad8bf-7cf9-4bab-9638-b26d9c593188-config\") pod \"console-operator-58897d9998-sp2mq\" (UID: \"ee2ad8bf-7cf9-4bab-9638-b26d9c593188\") " pod="openshift-console-operator/console-operator-58897d9998-sp2mq" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.037439 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.037457 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8fzv\" (UniqueName: \"kubernetes.io/projected/23f87d2b-2a92-4abb-a2a6-2de508837343-kube-api-access-j8fzv\") pod \"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.037569 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-config\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.037696 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-service-ca\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.037874 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1ffb8d96-e6e4-4859-ae7d-37f900979485-audit\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.038805 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.037905 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dc446a1-b77b-4f15-ae5f-0141bf374cdd-config\") pod \"openshift-apiserver-operator-796bbdcf4f-k5dfc\" (UID: \"6dc446a1-b77b-4f15-ae5f-0141bf374cdd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.041073 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee2ad8bf-7cf9-4bab-9638-b26d9c593188-trusted-ca\") pod 
\"console-operator-58897d9998-sp2mq\" (UID: \"ee2ad8bf-7cf9-4bab-9638-b26d9c593188\") " pod="openshift-console-operator/console-operator-58897d9998-sp2mq" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.041223 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndfng\" (UniqueName: \"kubernetes.io/projected/61b85db0-a292-42a8-8296-d0e476d80c89-kube-api-access-ndfng\") pod \"openshift-config-operator-7777fb866f-fxv7j\" (UID: \"61b85db0-a292-42a8-8296-d0e476d80c89\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.041357 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-audit-policies\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.041462 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/23f87d2b-2a92-4abb-a2a6-2de508837343-etcd-service-ca\") pod \"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.041554 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1ffb8d96-e6e4-4859-ae7d-37f900979485-etcd-serving-ca\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.041847 4764 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ffb8d96-e6e4-4859-ae7d-37f900979485-serving-cert\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.041930 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-serving-cert\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.042279 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1ffb8d96-e6e4-4859-ae7d-37f900979485-etcd-client\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.042374 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-trusted-ca-bundle\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.042490 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4125448d-5832-43c2-8dba-d95adde7458a-config\") pod \"machine-api-operator-5694c8668f-t8ft9\" (UID: \"4125448d-5832-43c2-8dba-d95adde7458a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.042618 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad49b4d8-1218-4a34-8455-831d0f563cbf-serving-cert\") pod \"route-controller-manager-6576b87f9c-mvq6r\" (UID: \"ad49b4d8-1218-4a34-8455-831d0f563cbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.042756 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.043100 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4951d770-ae8c-470a-982a-807c82112722-config\") pod \"authentication-operator-69f744f599-ptjnd\" (UID: \"4951d770-ae8c-470a-982a-807c82112722\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.043211 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48tpb\" (UniqueName: \"kubernetes.io/projected/ad49b4d8-1218-4a34-8455-831d0f563cbf-kube-api-access-48tpb\") pod \"route-controller-manager-6576b87f9c-mvq6r\" (UID: \"ad49b4d8-1218-4a34-8455-831d0f563cbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.043529 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/4125448d-5832-43c2-8dba-d95adde7458a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t8ft9\" (UID: \"4125448d-5832-43c2-8dba-d95adde7458a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.043671 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.043927 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e025e897-5ff3-476b-81c9-afdd0ae7a25f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-84448\" (UID: \"e025e897-5ff3-476b-81c9-afdd0ae7a25f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.044035 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h4rc\" (UniqueName: \"kubernetes.io/projected/1ffb8d96-e6e4-4859-ae7d-37f900979485-kube-api-access-9h4rc\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.044257 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dc446a1-b77b-4f15-ae5f-0141bf374cdd-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-k5dfc\" (UID: \"6dc446a1-b77b-4f15-ae5f-0141bf374cdd\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.044521 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/23f87d2b-2a92-4abb-a2a6-2de508837343-etcd-client\") pod \"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.046906 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1ffb8d96-e6e4-4859-ae7d-37f900979485-audit\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.047764 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1ffb8d96-e6e4-4859-ae7d-37f900979485-etcd-serving-ca\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.049317 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.046749 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.050603 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-etcd-client\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: 
\"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.051264 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf3d7b2a-75e7-4c07-9211-b66c64c15def-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-klv4l\" (UID: \"cf3d7b2a-75e7-4c07-9211-b66c64c15def\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.051331 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1ffb8d96-e6e4-4859-ae7d-37f900979485-image-import-ca\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.051394 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee2ad8bf-7cf9-4bab-9638-b26d9c593188-serving-cert\") pod \"console-operator-58897d9998-sp2mq\" (UID: \"ee2ad8bf-7cf9-4bab-9638-b26d9c593188\") " pod="openshift-console-operator/console-operator-58897d9998-sp2mq" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.051419 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.051447 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-oauth-serving-cert\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.051706 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23f87d2b-2a92-4abb-a2a6-2de508837343-config\") pod \"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.051802 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ffb8d96-e6e4-4859-ae7d-37f900979485-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.052176 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k77xq\" (UniqueName: \"kubernetes.io/projected/cf3d7b2a-75e7-4c07-9211-b66c64c15def-kube-api-access-k77xq\") pod \"openshift-controller-manager-operator-756b6f6bc6-klv4l\" (UID: \"cf3d7b2a-75e7-4c07-9211-b66c64c15def\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.052317 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-nj856\" (UID: 
\"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.052856 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/23f87d2b-2a92-4abb-a2a6-2de508837343-etcd-ca\") pod \"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.052911 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-serving-cert\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.052988 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1ffb8d96-e6e4-4859-ae7d-37f900979485-encryption-config\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.053039 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/61b85db0-a292-42a8-8296-d0e476d80c89-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fxv7j\" (UID: \"61b85db0-a292-42a8-8296-d0e476d80c89\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.053267 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ad49b4d8-1218-4a34-8455-831d0f563cbf-config\") pod \"route-controller-manager-6576b87f9c-mvq6r\" (UID: \"ad49b4d8-1218-4a34-8455-831d0f563cbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.053667 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f9b3244b-8df0-4330-9887-4092260d416a-audit-dir\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.061456 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.061780 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgwzj\" (UniqueName: \"kubernetes.io/projected/ee2ad8bf-7cf9-4bab-9638-b26d9c593188-kube-api-access-bgwzj\") pod \"console-operator-58897d9998-sp2mq\" (UID: \"ee2ad8bf-7cf9-4bab-9638-b26d9c593188\") " pod="openshift-console-operator/console-operator-58897d9998-sp2mq" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.061812 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc9tr\" (UniqueName: \"kubernetes.io/projected/4125448d-5832-43c2-8dba-d95adde7458a-kube-api-access-vc9tr\") pod \"machine-api-operator-5694c8668f-t8ft9\" (UID: \"4125448d-5832-43c2-8dba-d95adde7458a\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.061837 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4951d770-ae8c-470a-982a-807c82112722-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ptjnd\" (UID: \"4951d770-ae8c-470a-982a-807c82112722\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.061855 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61b85db0-a292-42a8-8296-d0e476d80c89-serving-cert\") pod \"openshift-config-operator-7777fb866f-fxv7j\" (UID: \"61b85db0-a292-42a8-8296-d0e476d80c89\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.061871 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.061888 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.061978 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4951d770-ae8c-470a-982a-807c82112722-serving-cert\") pod \"authentication-operator-69f744f599-ptjnd\" (UID: \"4951d770-ae8c-470a-982a-807c82112722\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062003 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf3d7b2a-75e7-4c07-9211-b66c64c15def-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-klv4l\" (UID: \"cf3d7b2a-75e7-4c07-9211-b66c64c15def\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062019 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062050 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdsds\" (UniqueName: \"kubernetes.io/projected/f9b3244b-8df0-4330-9887-4092260d416a-kube-api-access-tdsds\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062104 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1ffb8d96-e6e4-4859-ae7d-37f900979485-audit-dir\") pod 
\"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062123 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062142 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x6ch\" (UniqueName: \"kubernetes.io/projected/baee6113-40a8-468e-b343-09e9afd65ce3-kube-api-access-8x6ch\") pod \"dns-operator-744455d44c-c2m6k\" (UID: \"baee6113-40a8-468e-b343-09e9afd65ce3\") " pod="openshift-dns-operator/dns-operator-744455d44c-c2m6k" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062165 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-config\") pod \"controller-manager-879f6c89f-tgqwl\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062180 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062197 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn7dv\" (UniqueName: \"kubernetes.io/projected/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-kube-api-access-vn7dv\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062214 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-audit-policies\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062229 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-audit-dir\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062246 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrcvk\" (UniqueName: \"kubernetes.io/projected/4951d770-ae8c-470a-982a-807c82112722-kube-api-access-lrcvk\") pod \"authentication-operator-69f744f599-ptjnd\" (UID: \"4951d770-ae8c-470a-982a-807c82112722\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062265 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/baee6113-40a8-468e-b343-09e9afd65ce3-metrics-tls\") pod \"dns-operator-744455d44c-c2m6k\" (UID: \"baee6113-40a8-468e-b343-09e9afd65ce3\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-c2m6k" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062287 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1ffb8d96-e6e4-4859-ae7d-37f900979485-node-pullsecrets\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062304 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ffb8d96-e6e4-4859-ae7d-37f900979485-config\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062320 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4125448d-5832-43c2-8dba-d95adde7458a-images\") pod \"machine-api-operator-5694c8668f-t8ft9\" (UID: \"4125448d-5832-43c2-8dba-d95adde7458a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062440 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1ffb8d96-e6e4-4859-ae7d-37f900979485-audit-dir\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062519 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1ffb8d96-e6e4-4859-ae7d-37f900979485-node-pullsecrets\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " 
pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.063015 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ffb8d96-e6e4-4859-ae7d-37f900979485-config\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.065347 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ffb8d96-e6e4-4859-ae7d-37f900979485-serving-cert\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.065420 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.066036 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-w49hn"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.066303 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.066603 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.066694 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1ffb8d96-e6e4-4859-ae7d-37f900979485-image-import-ca\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.066902 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.067340 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.067524 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.067832 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ffb8d96-e6e4-4859-ae7d-37f900979485-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.067865 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.067978 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.068107 4764 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.068172 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.068215 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.068612 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.069223 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1ffb8d96-e6e4-4859-ae7d-37f900979485-etcd-client\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.070372 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.071125 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.074269 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.074438 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.075047 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.075782 4764 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.076799 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.077378 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.077537 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.079831 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.080346 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-qddjs"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.080523 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.080371 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.080949 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.081000 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-qddjs" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.081469 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hf98d"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.081603 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.082194 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hf98d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.082518 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.082707 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1ffb8d96-e6e4-4859-ae7d-37f900979485-encryption-config\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.083062 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.084742 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-svr2n"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.085425 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-svr2n" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.085862 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gst9d"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.087138 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-b6nrf"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.087842 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b6nrf" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.088526 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9k28f"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.089526 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9k28f" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.089825 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.090616 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.090853 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.091567 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.091681 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-gnnbl"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.092674 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.092688 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.093136 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.093599 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-d4gwh"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.094044 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.094527 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.095063 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.095634 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.096098 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.096629 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.096910 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551044-p748f"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.097400 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551044-p748f" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.098559 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tgqwl"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.099464 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ptjnd"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.101679 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.102115 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.102176 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.102687 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.106051 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5g2pp"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.106698 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.109554 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t8ft9"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.117806 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.119377 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c2m6k"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.122946 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-k96kg"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.126006 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k96kg" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.128269 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nnrmm"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.131535 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hf98d"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.133491 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-8g9lj"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.136146 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.136603 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-qddjs"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.138260 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.139202 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.140511 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.141610 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nj856"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.142931 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 
13:24:36.144072 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.145554 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-sp2mq"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.146947 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.148254 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.149419 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.150506 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-x927s"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.151592 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-w49hn"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.152974 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.153985 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-b6nrf"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.155045 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9k28f"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.156198 4764 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.156271 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.157267 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-k96kg"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.158354 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551044-p748f"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.159452 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.160904 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-d4gwh"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.162226 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163232 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61b85db0-a292-42a8-8296-d0e476d80c89-serving-cert\") pod \"openshift-config-operator-7777fb866f-fxv7j\" (UID: \"61b85db0-a292-42a8-8296-d0e476d80c89\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163270 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163293 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163322 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgbg9\" (UniqueName: \"kubernetes.io/projected/b72bd4db-e5ea-44f6-bdce-81df2966acfb-kube-api-access-mgbg9\") pod \"catalog-operator-68c6474976-m9kmt\" (UID: \"b72bd4db-e5ea-44f6-bdce-81df2966acfb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163377 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163411 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdsds\" (UniqueName: \"kubernetes.io/projected/f9b3244b-8df0-4330-9887-4092260d416a-kube-api-access-tdsds\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163421 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163445 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163549 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x6ch\" (UniqueName: \"kubernetes.io/projected/baee6113-40a8-468e-b343-09e9afd65ce3-kube-api-access-8x6ch\") pod \"dns-operator-744455d44c-c2m6k\" (UID: \"baee6113-40a8-468e-b343-09e9afd65ce3\") " pod="openshift-dns-operator/dns-operator-744455d44c-c2m6k" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163579 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-config\") pod \"controller-manager-879f6c89f-tgqwl\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163606 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497-serving-cert\") pod \"service-ca-operator-777779d784-svr2n\" (UID: \"8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-svr2n" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163633 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-audit-dir\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163668 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrcvk\" (UniqueName: \"kubernetes.io/projected/4951d770-ae8c-470a-982a-807c82112722-kube-api-access-lrcvk\") pod \"authentication-operator-69f744f599-ptjnd\" (UID: \"4951d770-ae8c-470a-982a-807c82112722\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163701 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/baee6113-40a8-468e-b343-09e9afd65ce3-metrics-tls\") pod \"dns-operator-744455d44c-c2m6k\" (UID: \"baee6113-40a8-468e-b343-09e9afd65ce3\") " pod="openshift-dns-operator/dns-operator-744455d44c-c2m6k" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163727 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-encryption-config\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163745 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ad49b4d8-1218-4a34-8455-831d0f563cbf-client-ca\") pod \"route-controller-manager-6576b87f9c-mvq6r\" (UID: \"ad49b4d8-1218-4a34-8455-831d0f563cbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163767 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-default-certificate\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163806 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23f87d2b-2a92-4abb-a2a6-2de508837343-serving-cert\") pod \"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163831 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p79zf\" (UniqueName: \"kubernetes.io/projected/b625331d-48ab-4d48-86fd-fe73466305ff-kube-api-access-p79zf\") pod \"controller-manager-879f6c89f-tgqwl\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163852 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163871 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9c0d96b-ed96-4925-b890-8743879a8b38-service-ca-bundle\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " 
pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163890 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497-config\") pod \"service-ca-operator-777779d784-svr2n\" (UID: \"8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-svr2n" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163907 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-client-ca\") pod \"controller-manager-879f6c89f-tgqwl\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163927 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b72bd4db-e5ea-44f6-bdce-81df2966acfb-srv-cert\") pod \"catalog-operator-68c6474976-m9kmt\" (UID: \"b72bd4db-e5ea-44f6-bdce-81df2966acfb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163951 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgk9n\" (UniqueName: \"kubernetes.io/projected/6dc446a1-b77b-4f15-ae5f-0141bf374cdd-kube-api-access-pgk9n\") pod \"openshift-apiserver-operator-796bbdcf4f-k5dfc\" (UID: \"6dc446a1-b77b-4f15-ae5f-0141bf374cdd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163973 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163996 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164015 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8fzv\" (UniqueName: \"kubernetes.io/projected/23f87d2b-2a92-4abb-a2a6-2de508837343-kube-api-access-j8fzv\") pod \"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164047 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-config\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164066 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5qdx\" (UniqueName: \"kubernetes.io/projected/db0ec273-54d8-4753-b519-243b727a9efd-kube-api-access-f5qdx\") pod \"package-server-manager-789f6589d5-b2pz8\" (UID: \"db0ec273-54d8-4753-b519-243b727a9efd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8" Mar 09 13:24:36 
crc kubenswrapper[4764]: I0309 13:24:36.164090 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dc446a1-b77b-4f15-ae5f-0141bf374cdd-config\") pod \"openshift-apiserver-operator-796bbdcf4f-k5dfc\" (UID: \"6dc446a1-b77b-4f15-ae5f-0141bf374cdd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164110 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-audit-policies\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164127 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/23f87d2b-2a92-4abb-a2a6-2de508837343-etcd-service-ca\") pod \"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164144 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4912a02a-743d-4bbe-9063-7d99ccd3329a-proxy-tls\") pod \"machine-config-operator-74547568cd-2pfhk\" (UID: \"4912a02a-743d-4bbe-9063-7d99ccd3329a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164162 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39da5087-79bc-4154-b340-22183d9e4417-secret-volume\") pod \"collect-profiles-29551035-5fpgn\" (UID: 
\"39da5087-79bc-4154-b340-22183d9e4417\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164182 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-serving-cert\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164201 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g47h8\" (UniqueName: \"kubernetes.io/projected/0a005f65-920a-4cdd-b4da-a270953113aa-kube-api-access-g47h8\") pod \"auto-csr-approver-29551044-p748f\" (UID: \"0a005f65-920a-4cdd-b4da-a270953113aa\") " pod="openshift-infra/auto-csr-approver-29551044-p748f" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164222 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4125448d-5832-43c2-8dba-d95adde7458a-config\") pod \"machine-api-operator-5694c8668f-t8ft9\" (UID: \"4125448d-5832-43c2-8dba-d95adde7458a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164241 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164298 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-metrics-certs\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164317 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164320 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4125448d-5832-43c2-8dba-d95adde7458a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t8ft9\" (UID: \"4125448d-5832-43c2-8dba-d95adde7458a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164372 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164381 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14b4aa8c-1066-4388-9442-07722e4c76c2-apiservice-cert\") pod \"packageserver-d55dfcdfc-m4bs6\" (UID: \"14b4aa8c-1066-4388-9442-07722e4c76c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" Mar 09 13:24:36 crc kubenswrapper[4764]: 
I0309 13:24:36.164426 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dc446a1-b77b-4f15-ae5f-0141bf374cdd-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-k5dfc\" (UID: \"6dc446a1-b77b-4f15-ae5f-0141bf374cdd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164507 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/23f87d2b-2a92-4abb-a2a6-2de508837343-etcd-client\") pod \"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164880 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.165915 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dc446a1-b77b-4f15-ae5f-0141bf374cdd-config\") pod \"openshift-apiserver-operator-796bbdcf4f-k5dfc\" (UID: \"6dc446a1-b77b-4f15-ae5f-0141bf374cdd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.165929 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-config\") pod \"controller-manager-879f6c89f-tgqwl\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.166357 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.167014 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-config\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.167082 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-svr2n"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.167483 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-client-ca\") pod \"controller-manager-879f6c89f-tgqwl\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.167630 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dc446a1-b77b-4f15-ae5f-0141bf374cdd-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-k5dfc\" (UID: \"6dc446a1-b77b-4f15-ae5f-0141bf374cdd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.167695 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4125448d-5832-43c2-8dba-d95adde7458a-config\") pod \"machine-api-operator-5694c8668f-t8ft9\" (UID: 
\"4125448d-5832-43c2-8dba-d95adde7458a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.167920 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4125448d-5832-43c2-8dba-d95adde7458a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t8ft9\" (UID: \"4125448d-5832-43c2-8dba-d95adde7458a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.168078 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-audit-dir\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.168248 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-audit-policies\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.168298 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ba55602-0e3f-4722-b437-546732351bc4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-9k28f\" (UID: \"4ba55602-0e3f-4722-b437-546732351bc4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9k28f" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.168333 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.168345 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf3d7b2a-75e7-4c07-9211-b66c64c15def-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-klv4l\" (UID: \"cf3d7b2a-75e7-4c07-9211-b66c64c15def\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.168357 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/23f87d2b-2a92-4abb-a2a6-2de508837343-etcd-service-ca\") pod \"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.168359 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-serving-cert\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.168400 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2bqx9"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.168695 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-session\") pod 
\"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.168399 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee2ad8bf-7cf9-4bab-9638-b26d9c593188-serving-cert\") pod \"console-operator-58897d9998-sp2mq\" (UID: \"ee2ad8bf-7cf9-4bab-9638-b26d9c593188\") " pod="openshift-console-operator/console-operator-58897d9998-sp2mq" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.168960 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/14b4aa8c-1066-4388-9442-07722e4c76c2-tmpfs\") pod \"packageserver-d55dfcdfc-m4bs6\" (UID: \"14b4aa8c-1066-4388-9442-07722e4c76c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169040 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-oauth-serving-cert\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169067 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf3d7b2a-75e7-4c07-9211-b66c64c15def-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-klv4l\" (UID: \"cf3d7b2a-75e7-4c07-9211-b66c64c15def\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169114 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/db0ec273-54d8-4753-b519-243b727a9efd-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-b2pz8\" (UID: \"db0ec273-54d8-4753-b519-243b727a9efd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169206 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/30a07c97-9d99-41be-956e-ba3d6505d318-srv-cert\") pod \"olm-operator-6b444d44fb-ptbpd\" (UID: \"30a07c97-9d99-41be-956e-ba3d6505d318\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169257 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpcrn\" (UniqueName: \"kubernetes.io/projected/4912a02a-743d-4bbe-9063-7d99ccd3329a-kube-api-access-jpcrn\") pod \"machine-config-operator-74547568cd-2pfhk\" (UID: \"4912a02a-743d-4bbe-9063-7d99ccd3329a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169307 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k77xq\" (UniqueName: \"kubernetes.io/projected/cf3d7b2a-75e7-4c07-9211-b66c64c15def-kube-api-access-k77xq\") pod \"openshift-controller-manager-operator-756b6f6bc6-klv4l\" (UID: \"cf3d7b2a-75e7-4c07-9211-b66c64c15def\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169333 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23f87d2b-2a92-4abb-a2a6-2de508837343-config\") pod \"etcd-operator-b45778765-nnrmm\" (UID: 
\"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169355 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/23f87d2b-2a92-4abb-a2a6-2de508837343-etcd-ca\") pod \"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169376 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/61b85db0-a292-42a8-8296-d0e476d80c89-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fxv7j\" (UID: \"61b85db0-a292-42a8-8296-d0e476d80c89\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169400 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad49b4d8-1218-4a34-8455-831d0f563cbf-config\") pod \"route-controller-manager-6576b87f9c-mvq6r\" (UID: \"ad49b4d8-1218-4a34-8455-831d0f563cbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169422 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f9b3244b-8df0-4330-9887-4092260d416a-audit-dir\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169447 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169474 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdqjn\" (UniqueName: \"kubernetes.io/projected/4ba55602-0e3f-4722-b437-546732351bc4-kube-api-access-fdqjn\") pod \"control-plane-machine-set-operator-78cbb6b69f-9k28f\" (UID: \"4ba55602-0e3f-4722-b437-546732351bc4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9k28f" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169501 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc9tr\" (UniqueName: \"kubernetes.io/projected/4125448d-5832-43c2-8dba-d95adde7458a-kube-api-access-vc9tr\") pod \"machine-api-operator-5694c8668f-t8ft9\" (UID: \"4125448d-5832-43c2-8dba-d95adde7458a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169525 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/711447e6-e7cf-4577-8050-b5a391f96f6a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xfvtf\" (UID: \"711447e6-e7cf-4577-8050-b5a391f96f6a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169541 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/baee6113-40a8-468e-b343-09e9afd65ce3-metrics-tls\") pod \"dns-operator-744455d44c-c2m6k\" (UID: \"baee6113-40a8-468e-b343-09e9afd65ce3\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-c2m6k" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169549 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4912a02a-743d-4bbe-9063-7d99ccd3329a-images\") pod \"machine-config-operator-74547568cd-2pfhk\" (UID: \"4912a02a-743d-4bbe-9063-7d99ccd3329a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169577 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4951d770-ae8c-470a-982a-807c82112722-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ptjnd\" (UID: \"4951d770-ae8c-470a-982a-807c82112722\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169607 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2l2r\" (UniqueName: \"kubernetes.io/projected/ad307984-e46b-466b-8a5c-63a00976fbbf-kube-api-access-b2l2r\") pod \"cluster-samples-operator-665b6dd947-5g2pp\" (UID: \"ad307984-e46b-466b-8a5c-63a00976fbbf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5g2pp" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169635 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4912a02a-743d-4bbe-9063-7d99ccd3329a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2pfhk\" (UID: \"4912a02a-743d-4bbe-9063-7d99ccd3329a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169698 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4951d770-ae8c-470a-982a-807c82112722-serving-cert\") pod \"authentication-operator-69f744f599-ptjnd\" (UID: \"4951d770-ae8c-470a-982a-807c82112722\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169728 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf3d7b2a-75e7-4c07-9211-b66c64c15def-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-klv4l\" (UID: \"cf3d7b2a-75e7-4c07-9211-b66c64c15def\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169758 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-d4gwh\" (UID: \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\") " pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169763 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61b85db0-a292-42a8-8296-d0e476d80c89-serving-cert\") pod \"openshift-config-operator-7777fb866f-fxv7j\" (UID: \"61b85db0-a292-42a8-8296-d0e476d80c89\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169796 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: 
\"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169826 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn7dv\" (UniqueName: \"kubernetes.io/projected/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-kube-api-access-vn7dv\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169835 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169854 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-audit-policies\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169888 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0959a00-2a83-457f-bcba-7d4af48b11c3-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-tvsss\" (UID: \"d0959a00-2a83-457f-bcba-7d4af48b11c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169940 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4125448d-5832-43c2-8dba-d95adde7458a-images\") pod \"machine-api-operator-5694c8668f-t8ft9\" (UID: \"4125448d-5832-43c2-8dba-d95adde7458a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9" Mar 09 13:24:36 crc 
kubenswrapper[4764]: I0309 13:24:36.169970 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67pl5\" (UniqueName: \"kubernetes.io/projected/b9c0d96b-ed96-4925-b890-8743879a8b38-kube-api-access-67pl5\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170000 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e025e897-5ff3-476b-81c9-afdd0ae7a25f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-84448\" (UID: \"e025e897-5ff3-476b-81c9-afdd0ae7a25f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170026 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e025e897-5ff3-476b-81c9-afdd0ae7a25f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-84448\" (UID: \"e025e897-5ff3-476b-81c9-afdd0ae7a25f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170051 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qxpr\" (UniqueName: \"kubernetes.io/projected/e025e897-5ff3-476b-81c9-afdd0ae7a25f-kube-api-access-2qxpr\") pod \"cluster-image-registry-operator-dc59b4c8b-84448\" (UID: \"e025e897-5ff3-476b-81c9-afdd0ae7a25f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170082 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/4951d770-ae8c-470a-982a-807c82112722-service-ca-bundle\") pod \"authentication-operator-69f744f599-ptjnd\" (UID: \"4951d770-ae8c-470a-982a-807c82112722\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170109 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f9zl\" (UniqueName: \"kubernetes.io/projected/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-kube-api-access-9f9zl\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170121 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170139 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b625331d-48ab-4d48-86fd-fe73466305ff-serving-cert\") pod \"controller-manager-879f6c89f-tgqwl\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170170 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/30a07c97-9d99-41be-956e-ba3d6505d318-profile-collector-cert\") pod \"olm-operator-6b444d44fb-ptbpd\" (UID: \"30a07c97-9d99-41be-956e-ba3d6505d318\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd" Mar 09 13:24:36 crc 
kubenswrapper[4764]: I0309 13:24:36.170198 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-oauth-config\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170228 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tgqwl\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170255 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee2ad8bf-7cf9-4bab-9638-b26d9c593188-config\") pod \"console-operator-58897d9998-sp2mq\" (UID: \"ee2ad8bf-7cf9-4bab-9638-b26d9c593188\") " pod="openshift-console-operator/console-operator-58897d9998-sp2mq" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170281 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-service-ca\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170309 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-d4gwh\" (UID: \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170334 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14b4aa8c-1066-4388-9442-07722e4c76c2-webhook-cert\") pod \"packageserver-d55dfcdfc-m4bs6\" (UID: \"14b4aa8c-1066-4388-9442-07722e4c76c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170364 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/711447e6-e7cf-4577-8050-b5a391f96f6a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xfvtf\" (UID: \"711447e6-e7cf-4577-8050-b5a391f96f6a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170387 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/61b85db0-a292-42a8-8296-d0e476d80c89-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fxv7j\" (UID: \"61b85db0-a292-42a8-8296-d0e476d80c89\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170392 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee2ad8bf-7cf9-4bab-9638-b26d9c593188-trusted-ca\") pod \"console-operator-58897d9998-sp2mq\" (UID: \"ee2ad8bf-7cf9-4bab-9638-b26d9c593188\") " pod="openshift-console-operator/console-operator-58897d9998-sp2mq" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170466 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndfng\" 
(UniqueName: \"kubernetes.io/projected/61b85db0-a292-42a8-8296-d0e476d80c89-kube-api-access-ndfng\") pod \"openshift-config-operator-7777fb866f-fxv7j\" (UID: \"61b85db0-a292-42a8-8296-d0e476d80c89\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170501 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/711447e6-e7cf-4577-8050-b5a391f96f6a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xfvtf\" (UID: \"711447e6-e7cf-4577-8050-b5a391f96f6a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170530 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad307984-e46b-466b-8a5c-63a00976fbbf-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5g2pp\" (UID: \"ad307984-e46b-466b-8a5c-63a00976fbbf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5g2pp" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170558 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss45h\" (UniqueName: \"kubernetes.io/projected/39da5087-79bc-4154-b340-22183d9e4417-kube-api-access-ss45h\") pod \"collect-profiles-29551035-5fpgn\" (UID: \"39da5087-79bc-4154-b340-22183d9e4417\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170588 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s85j9\" (UniqueName: \"kubernetes.io/projected/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-kube-api-access-s85j9\") pod \"marketplace-operator-79b997595-d4gwh\" 
(UID: \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\") " pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170620 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad49b4d8-1218-4a34-8455-831d0f563cbf-serving-cert\") pod \"route-controller-manager-6576b87f9c-mvq6r\" (UID: \"ad49b4d8-1218-4a34-8455-831d0f563cbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170667 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-trusted-ca-bundle\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170693 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.171264 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/23f87d2b-2a92-4abb-a2a6-2de508837343-etcd-ca\") pod \"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.171355 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4951d770-ae8c-470a-982a-807c82112722-config\") pod 
\"authentication-operator-69f744f599-ptjnd\" (UID: \"4951d770-ae8c-470a-982a-807c82112722\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.171480 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee2ad8bf-7cf9-4bab-9638-b26d9c593188-trusted-ca\") pod \"console-operator-58897d9998-sp2mq\" (UID: \"ee2ad8bf-7cf9-4bab-9638-b26d9c593188\") " pod="openshift-console-operator/console-operator-58897d9998-sp2mq" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.171496 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e025e897-5ff3-476b-81c9-afdd0ae7a25f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-84448\" (UID: \"e025e897-5ff3-476b-81c9-afdd0ae7a25f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.171592 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6nhpx"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.171934 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-audit-policies\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170697 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4951d770-ae8c-470a-982a-807c82112722-config\") pod \"authentication-operator-69f744f599-ptjnd\" (UID: \"4951d770-ae8c-470a-982a-807c82112722\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd" Mar 09 13:24:36 crc 
kubenswrapper[4764]: I0309 13:24:36.172615 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-oauth-serving-cert\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.172870 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4951d770-ae8c-470a-982a-807c82112722-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ptjnd\" (UID: \"4951d770-ae8c-470a-982a-807c82112722\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.172964 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.173093 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-6nhpx" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.173700 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee2ad8bf-7cf9-4bab-9638-b26d9c593188-config\") pod \"console-operator-58897d9998-sp2mq\" (UID: \"ee2ad8bf-7cf9-4bab-9638-b26d9c593188\") " pod="openshift-console-operator/console-operator-58897d9998-sp2mq" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.173850 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tgqwl\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.174393 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4125448d-5832-43c2-8dba-d95adde7458a-images\") pod \"machine-api-operator-5694c8668f-t8ft9\" (UID: \"4125448d-5832-43c2-8dba-d95adde7458a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.174464 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2bqx9"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.174569 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-oauth-config\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.174789 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ad49b4d8-1218-4a34-8455-831d0f563cbf-serving-cert\") pod \"route-controller-manager-6576b87f9c-mvq6r\" (UID: \"ad49b4d8-1218-4a34-8455-831d0f563cbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.174854 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48tpb\" (UniqueName: \"kubernetes.io/projected/ad49b4d8-1218-4a34-8455-831d0f563cbf-kube-api-access-48tpb\") pod \"route-controller-manager-6576b87f9c-mvq6r\" (UID: \"ad49b4d8-1218-4a34-8455-831d0f563cbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.174859 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6nhpx"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.174886 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9vzq\" (UniqueName: \"kubernetes.io/projected/14b4aa8c-1066-4388-9442-07722e4c76c2-kube-api-access-w9vzq\") pod \"packageserver-d55dfcdfc-m4bs6\" (UID: \"14b4aa8c-1066-4388-9442-07722e4c76c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.174915 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.174938 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-stats-auth\") pod 
\"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.174971 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e025e897-5ff3-476b-81c9-afdd0ae7a25f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-84448\" (UID: \"e025e897-5ff3-476b-81c9-afdd0ae7a25f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.174998 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0959a00-2a83-457f-bcba-7d4af48b11c3-config\") pod \"kube-apiserver-operator-766d6c64bb-tvsss\" (UID: \"d0959a00-2a83-457f-bcba-7d4af48b11c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.175010 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-service-ca\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.175027 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnv7g\" (UniqueName: \"kubernetes.io/projected/8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497-kube-api-access-vnv7g\") pod \"service-ca-operator-777779d784-svr2n\" (UID: \"8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-svr2n" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.175211 4764 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4951d770-ae8c-470a-982a-807c82112722-service-ca-bundle\") pod \"authentication-operator-69f744f599-ptjnd\" (UID: \"4951d770-ae8c-470a-982a-807c82112722\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.175295 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf3d7b2a-75e7-4c07-9211-b66c64c15def-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-klv4l\" (UID: \"cf3d7b2a-75e7-4c07-9211-b66c64c15def\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.175314 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f9b3244b-8df0-4330-9887-4092260d416a-audit-dir\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.175385 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-trusted-ca-bundle\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.175502 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-etcd-client\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 
13:24:36.175619 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.175713 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ad49b4d8-1218-4a34-8455-831d0f563cbf-client-ca\") pod \"route-controller-manager-6576b87f9c-mvq6r\" (UID: \"ad49b4d8-1218-4a34-8455-831d0f563cbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.175860 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0959a00-2a83-457f-bcba-7d4af48b11c3-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-tvsss\" (UID: \"d0959a00-2a83-457f-bcba-7d4af48b11c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.175962 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.175991 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdclx\" (UniqueName: \"kubernetes.io/projected/30a07c97-9d99-41be-956e-ba3d6505d318-kube-api-access-rdclx\") pod \"olm-operator-6b444d44fb-ptbpd\" 
(UID: \"30a07c97-9d99-41be-956e-ba3d6505d318\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.176039 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.176049 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-kdxg4"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.176100 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.176374 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-serving-cert\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.176482 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b72bd4db-e5ea-44f6-bdce-81df2966acfb-profile-collector-cert\") pod \"catalog-operator-68c6474976-m9kmt\" (UID: \"b72bd4db-e5ea-44f6-bdce-81df2966acfb\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.176668 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad49b4d8-1218-4a34-8455-831d0f563cbf-config\") pod \"route-controller-manager-6576b87f9c-mvq6r\" (UID: \"ad49b4d8-1218-4a34-8455-831d0f563cbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.176857 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-kdxg4" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.177084 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee2ad8bf-7cf9-4bab-9638-b26d9c593188-serving-cert\") pod \"console-operator-58897d9998-sp2mq\" (UID: \"ee2ad8bf-7cf9-4bab-9638-b26d9c593188\") " pod="openshift-console-operator/console-operator-58897d9998-sp2mq" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.177170 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgwzj\" (UniqueName: \"kubernetes.io/projected/ee2ad8bf-7cf9-4bab-9638-b26d9c593188-kube-api-access-bgwzj\") pod \"console-operator-58897d9998-sp2mq\" (UID: \"ee2ad8bf-7cf9-4bab-9638-b26d9c593188\") " pod="openshift-console-operator/console-operator-58897d9998-sp2mq" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.177229 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39da5087-79bc-4154-b340-22183d9e4417-config-volume\") pod \"collect-profiles-29551035-5fpgn\" (UID: \"39da5087-79bc-4154-b340-22183d9e4417\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn" Mar 09 13:24:36 crc 
kubenswrapper[4764]: I0309 13:24:36.177754 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e025e897-5ff3-476b-81c9-afdd0ae7a25f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-84448\" (UID: \"e025e897-5ff3-476b-81c9-afdd0ae7a25f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.177811 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/23f87d2b-2a92-4abb-a2a6-2de508837343-etcd-client\") pod \"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.178080 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-encryption-config\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.179272 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.179287 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4951d770-ae8c-470a-982a-807c82112722-serving-cert\") pod \"authentication-operator-69f744f599-ptjnd\" (UID: \"4951d770-ae8c-470a-982a-807c82112722\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.179372 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.179427 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b625331d-48ab-4d48-86fd-fe73466305ff-serving-cert\") pod \"controller-manager-879f6c89f-tgqwl\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.180338 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.180557 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-serving-cert\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.183369 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23f87d2b-2a92-4abb-a2a6-2de508837343-serving-cert\") pod 
\"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.183678 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23f87d2b-2a92-4abb-a2a6-2de508837343-config\") pod \"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.183681 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.187255 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-etcd-client\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.213434 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h4rc\" (UniqueName: \"kubernetes.io/projected/1ffb8d96-e6e4-4859-ae7d-37f900979485-kube-api-access-9h4rc\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.215983 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 09 13:24:36 crc 
kubenswrapper[4764]: I0309 13:24:36.219267 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.236707 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.256430 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.276614 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.278741 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b72bd4db-e5ea-44f6-bdce-81df2966acfb-profile-collector-cert\") pod \"catalog-operator-68c6474976-m9kmt\" (UID: \"b72bd4db-e5ea-44f6-bdce-81df2966acfb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.278780 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39da5087-79bc-4154-b340-22183d9e4417-config-volume\") pod \"collect-profiles-29551035-5fpgn\" (UID: \"39da5087-79bc-4154-b340-22183d9e4417\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.278804 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgbg9\" (UniqueName: \"kubernetes.io/projected/b72bd4db-e5ea-44f6-bdce-81df2966acfb-kube-api-access-mgbg9\") pod \"catalog-operator-68c6474976-m9kmt\" (UID: 
\"b72bd4db-e5ea-44f6-bdce-81df2966acfb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.278854 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497-serving-cert\") pod \"service-ca-operator-777779d784-svr2n\" (UID: \"8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-svr2n" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.278877 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-default-certificate\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.278900 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9c0d96b-ed96-4925-b890-8743879a8b38-service-ca-bundle\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.278914 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497-config\") pod \"service-ca-operator-777779d784-svr2n\" (UID: \"8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-svr2n" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.278934 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/b72bd4db-e5ea-44f6-bdce-81df2966acfb-srv-cert\") pod \"catalog-operator-68c6474976-m9kmt\" (UID: \"b72bd4db-e5ea-44f6-bdce-81df2966acfb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.278963 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5qdx\" (UniqueName: \"kubernetes.io/projected/db0ec273-54d8-4753-b519-243b727a9efd-kube-api-access-f5qdx\") pod \"package-server-manager-789f6589d5-b2pz8\" (UID: \"db0ec273-54d8-4753-b519-243b727a9efd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279159 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4912a02a-743d-4bbe-9063-7d99ccd3329a-proxy-tls\") pod \"machine-config-operator-74547568cd-2pfhk\" (UID: \"4912a02a-743d-4bbe-9063-7d99ccd3329a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279212 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39da5087-79bc-4154-b340-22183d9e4417-secret-volume\") pod \"collect-profiles-29551035-5fpgn\" (UID: \"39da5087-79bc-4154-b340-22183d9e4417\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279240 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g47h8\" (UniqueName: \"kubernetes.io/projected/0a005f65-920a-4cdd-b4da-a270953113aa-kube-api-access-g47h8\") pod \"auto-csr-approver-29551044-p748f\" (UID: \"0a005f65-920a-4cdd-b4da-a270953113aa\") " pod="openshift-infra/auto-csr-approver-29551044-p748f" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 
13:24:36.279287 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-metrics-certs\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279305 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14b4aa8c-1066-4388-9442-07722e4c76c2-apiservice-cert\") pod \"packageserver-d55dfcdfc-m4bs6\" (UID: \"14b4aa8c-1066-4388-9442-07722e4c76c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279387 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ba55602-0e3f-4722-b437-546732351bc4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-9k28f\" (UID: \"4ba55602-0e3f-4722-b437-546732351bc4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9k28f" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279441 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/14b4aa8c-1066-4388-9442-07722e4c76c2-tmpfs\") pod \"packageserver-d55dfcdfc-m4bs6\" (UID: \"14b4aa8c-1066-4388-9442-07722e4c76c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279490 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/db0ec273-54d8-4753-b519-243b727a9efd-package-server-manager-serving-cert\") pod 
\"package-server-manager-789f6589d5-b2pz8\" (UID: \"db0ec273-54d8-4753-b519-243b727a9efd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279510 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/30a07c97-9d99-41be-956e-ba3d6505d318-srv-cert\") pod \"olm-operator-6b444d44fb-ptbpd\" (UID: \"30a07c97-9d99-41be-956e-ba3d6505d318\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279528 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpcrn\" (UniqueName: \"kubernetes.io/projected/4912a02a-743d-4bbe-9063-7d99ccd3329a-kube-api-access-jpcrn\") pod \"machine-config-operator-74547568cd-2pfhk\" (UID: \"4912a02a-743d-4bbe-9063-7d99ccd3329a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279615 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdqjn\" (UniqueName: \"kubernetes.io/projected/4ba55602-0e3f-4722-b437-546732351bc4-kube-api-access-fdqjn\") pod \"control-plane-machine-set-operator-78cbb6b69f-9k28f\" (UID: \"4ba55602-0e3f-4722-b437-546732351bc4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9k28f" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279631 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4912a02a-743d-4bbe-9063-7d99ccd3329a-images\") pod \"machine-config-operator-74547568cd-2pfhk\" (UID: \"4912a02a-743d-4bbe-9063-7d99ccd3329a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279735 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/711447e6-e7cf-4577-8050-b5a391f96f6a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xfvtf\" (UID: \"711447e6-e7cf-4577-8050-b5a391f96f6a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279754 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4912a02a-743d-4bbe-9063-7d99ccd3329a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2pfhk\" (UID: \"4912a02a-743d-4bbe-9063-7d99ccd3329a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279855 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2l2r\" (UniqueName: \"kubernetes.io/projected/ad307984-e46b-466b-8a5c-63a00976fbbf-kube-api-access-b2l2r\") pod \"cluster-samples-operator-665b6dd947-5g2pp\" (UID: \"ad307984-e46b-466b-8a5c-63a00976fbbf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5g2pp" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279875 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-d4gwh\" (UID: \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\") " pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279965 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0959a00-2a83-457f-bcba-7d4af48b11c3-kube-api-access\") pod 
\"kube-apiserver-operator-766d6c64bb-tvsss\" (UID: \"d0959a00-2a83-457f-bcba-7d4af48b11c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279986 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67pl5\" (UniqueName: \"kubernetes.io/projected/b9c0d96b-ed96-4925-b890-8743879a8b38-kube-api-access-67pl5\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.280014 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/14b4aa8c-1066-4388-9442-07722e4c76c2-tmpfs\") pod \"packageserver-d55dfcdfc-m4bs6\" (UID: \"14b4aa8c-1066-4388-9442-07722e4c76c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.280088 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/30a07c97-9d99-41be-956e-ba3d6505d318-profile-collector-cert\") pod \"olm-operator-6b444d44fb-ptbpd\" (UID: \"30a07c97-9d99-41be-956e-ba3d6505d318\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.280109 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-d4gwh\" (UID: \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\") " pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.280178 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14b4aa8c-1066-4388-9442-07722e4c76c2-webhook-cert\") pod \"packageserver-d55dfcdfc-m4bs6\" (UID: \"14b4aa8c-1066-4388-9442-07722e4c76c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.280196 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/711447e6-e7cf-4577-8050-b5a391f96f6a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xfvtf\" (UID: \"711447e6-e7cf-4577-8050-b5a391f96f6a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.280239 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/711447e6-e7cf-4577-8050-b5a391f96f6a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xfvtf\" (UID: \"711447e6-e7cf-4577-8050-b5a391f96f6a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.280258 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad307984-e46b-466b-8a5c-63a00976fbbf-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5g2pp\" (UID: \"ad307984-e46b-466b-8a5c-63a00976fbbf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5g2pp" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.280348 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss45h\" (UniqueName: \"kubernetes.io/projected/39da5087-79bc-4154-b340-22183d9e4417-kube-api-access-ss45h\") pod \"collect-profiles-29551035-5fpgn\" (UID: \"39da5087-79bc-4154-b340-22183d9e4417\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.280398 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s85j9\" (UniqueName: \"kubernetes.io/projected/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-kube-api-access-s85j9\") pod \"marketplace-operator-79b997595-d4gwh\" (UID: \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\") " pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.280433 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9vzq\" (UniqueName: \"kubernetes.io/projected/14b4aa8c-1066-4388-9442-07722e4c76c2-kube-api-access-w9vzq\") pod \"packageserver-d55dfcdfc-m4bs6\" (UID: \"14b4aa8c-1066-4388-9442-07722e4c76c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.280523 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-stats-auth\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.280542 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0959a00-2a83-457f-bcba-7d4af48b11c3-config\") pod \"kube-apiserver-operator-766d6c64bb-tvsss\" (UID: \"d0959a00-2a83-457f-bcba-7d4af48b11c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.280557 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnv7g\" (UniqueName: 
\"kubernetes.io/projected/8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497-kube-api-access-vnv7g\") pod \"service-ca-operator-777779d784-svr2n\" (UID: \"8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-svr2n" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.280574 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0959a00-2a83-457f-bcba-7d4af48b11c3-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-tvsss\" (UID: \"d0959a00-2a83-457f-bcba-7d4af48b11c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.280456 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4912a02a-743d-4bbe-9063-7d99ccd3329a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2pfhk\" (UID: \"4912a02a-743d-4bbe-9063-7d99ccd3329a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.280827 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdclx\" (UniqueName: \"kubernetes.io/projected/30a07c97-9d99-41be-956e-ba3d6505d318-kube-api-access-rdclx\") pod \"olm-operator-6b444d44fb-ptbpd\" (UID: \"30a07c97-9d99-41be-956e-ba3d6505d318\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.285092 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad307984-e46b-466b-8a5c-63a00976fbbf-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5g2pp\" (UID: \"ad307984-e46b-466b-8a5c-63a00976fbbf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5g2pp" Mar 
09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.300192 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.325372 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.339504 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.356919 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.377365 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.396597 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.417180 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.432129 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gst9d"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.437387 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.456372 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.477071 4764 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.497311 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.517175 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.537780 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.545480 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/30a07c97-9d99-41be-956e-ba3d6505d318-profile-collector-cert\") pod \"olm-operator-6b444d44fb-ptbpd\" (UID: \"30a07c97-9d99-41be-956e-ba3d6505d318\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.545502 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39da5087-79bc-4154-b340-22183d9e4417-secret-volume\") pod \"collect-profiles-29551035-5fpgn\" (UID: \"39da5087-79bc-4154-b340-22183d9e4417\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.546934 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b72bd4db-e5ea-44f6-bdce-81df2966acfb-profile-collector-cert\") pod \"catalog-operator-68c6474976-m9kmt\" (UID: \"b72bd4db-e5ea-44f6-bdce-81df2966acfb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.557701 4764 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.577841 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.580565 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39da5087-79bc-4154-b340-22183d9e4417-config-volume\") pod \"collect-profiles-29551035-5fpgn\" (UID: \"39da5087-79bc-4154-b340-22183d9e4417\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.597780 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.618020 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.623755 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4912a02a-743d-4bbe-9063-7d99ccd3329a-proxy-tls\") pod \"machine-config-operator-74547568cd-2pfhk\" (UID: \"4912a02a-743d-4bbe-9063-7d99ccd3329a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.637390 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.641839 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4912a02a-743d-4bbe-9063-7d99ccd3329a-images\") pod 
\"machine-config-operator-74547568cd-2pfhk\" (UID: \"4912a02a-743d-4bbe-9063-7d99ccd3329a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.671499 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.676736 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.681980 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0959a00-2a83-457f-bcba-7d4af48b11c3-config\") pod \"kube-apiserver-operator-766d6c64bb-tvsss\" (UID: \"d0959a00-2a83-457f-bcba-7d4af48b11c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.685130 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0959a00-2a83-457f-bcba-7d4af48b11c3-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-tvsss\" (UID: \"d0959a00-2a83-457f-bcba-7d4af48b11c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.697132 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.717025 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.738273 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 09 
13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.757760 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.776511 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.796916 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.816870 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.837516 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.858214 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.864309 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/db0ec273-54d8-4753-b519-243b727a9efd-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-b2pz8\" (UID: \"db0ec273-54d8-4753-b519-243b727a9efd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.877219 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.896467 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 09 13:24:36 crc kubenswrapper[4764]: 
I0309 13:24:36.916637 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.936977 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.958432 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.977713 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.998511 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.002898 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497-serving-cert\") pod \"service-ca-operator-777779d784-svr2n\" (UID: \"8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-svr2n" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.016854 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.038324 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.040500 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497-config\") pod \"service-ca-operator-777779d784-svr2n\" (UID: 
\"8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-svr2n" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.057350 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.077976 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.095383 4764 request.go:700] Waited for 1.00718331s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator/secrets?fieldSelector=metadata.name%3Dkube-storage-version-migrator-sa-dockercfg-5xfcg&limit=500&resourceVersion=0 Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.097553 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.117256 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.137106 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.142678 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ba55602-0e3f-4722-b437-546732351bc4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-9k28f\" (UID: \"4ba55602-0e3f-4722-b437-546732351bc4\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9k28f" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.158070 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.186795 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.197472 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.217270 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.237083 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.257725 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.266910 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/711447e6-e7cf-4577-8050-b5a391f96f6a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xfvtf\" (UID: \"711447e6-e7cf-4577-8050-b5a391f96f6a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.277924 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.279558 4764 secret.go:188] Couldn't get secret 
openshift-ingress/router-metrics-certs-default: failed to sync secret cache: timed out waiting for the condition Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.279582 4764 configmap.go:193] Couldn't get configMap openshift-ingress/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.279628 4764 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.279670 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-metrics-certs podName:b9c0d96b-ed96-4925-b890-8743879a8b38 nodeName:}" failed. No retries permitted until 2026-03-09 13:24:37.779619088 +0000 UTC m=+233.029791016 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-metrics-certs") pod "router-default-5444994796-gnnbl" (UID: "b9c0d96b-ed96-4925-b890-8743879a8b38") : failed to sync secret cache: timed out waiting for the condition Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.279678 4764 secret.go:188] Couldn't get secret openshift-ingress/router-certs-default: failed to sync secret cache: timed out waiting for the condition Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.279713 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b9c0d96b-ed96-4925-b890-8743879a8b38-service-ca-bundle podName:b9c0d96b-ed96-4925-b890-8743879a8b38 nodeName:}" failed. No retries permitted until 2026-03-09 13:24:37.77969103 +0000 UTC m=+233.029862958 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b9c0d96b-ed96-4925-b890-8743879a8b38-service-ca-bundle") pod "router-default-5444994796-gnnbl" (UID: "b9c0d96b-ed96-4925-b890-8743879a8b38") : failed to sync configmap cache: timed out waiting for the condition Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.279673 4764 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.279719 4764 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.279744 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14b4aa8c-1066-4388-9442-07722e4c76c2-apiservice-cert podName:14b4aa8c-1066-4388-9442-07722e4c76c2 nodeName:}" failed. No retries permitted until 2026-03-09 13:24:37.779731021 +0000 UTC m=+233.029902939 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/14b4aa8c-1066-4388-9442-07722e4c76c2-apiservice-cert") pod "packageserver-d55dfcdfc-m4bs6" (UID: "14b4aa8c-1066-4388-9442-07722e4c76c2") : failed to sync secret cache: timed out waiting for the condition Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.279890 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-default-certificate podName:b9c0d96b-ed96-4925-b890-8743879a8b38 nodeName:}" failed. No retries permitted until 2026-03-09 13:24:37.779871414 +0000 UTC m=+233.030043322 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "default-certificate" (UniqueName: "kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-default-certificate") pod "router-default-5444994796-gnnbl" (UID: "b9c0d96b-ed96-4925-b890-8743879a8b38") : failed to sync secret cache: timed out waiting for the condition Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.279919 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b72bd4db-e5ea-44f6-bdce-81df2966acfb-srv-cert podName:b72bd4db-e5ea-44f6-bdce-81df2966acfb nodeName:}" failed. No retries permitted until 2026-03-09 13:24:37.779909355 +0000 UTC m=+233.030081343 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/b72bd4db-e5ea-44f6-bdce-81df2966acfb-srv-cert") pod "catalog-operator-68c6474976-m9kmt" (UID: "b72bd4db-e5ea-44f6-bdce-81df2966acfb") : failed to sync secret cache: timed out waiting for the condition Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.279948 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30a07c97-9d99-41be-956e-ba3d6505d318-srv-cert podName:30a07c97-9d99-41be-956e-ba3d6505d318 nodeName:}" failed. No retries permitted until 2026-03-09 13:24:37.779933616 +0000 UTC m=+233.030105754 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/30a07c97-9d99-41be-956e-ba3d6505d318-srv-cert") pod "olm-operator-6b444d44fb-ptbpd" (UID: "30a07c97-9d99-41be-956e-ba3d6505d318") : failed to sync secret cache: timed out waiting for the condition Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.281114 4764 configmap.go:193] Couldn't get configMap openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-config: failed to sync configmap cache: timed out waiting for the condition Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.281149 4764 secret.go:188] Couldn't get secret openshift-ingress/router-stats-default: failed to sync secret cache: timed out waiting for the condition Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.281178 4764 secret.go:188] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: failed to sync secret cache: timed out waiting for the condition Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.281195 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-stats-auth podName:b9c0d96b-ed96-4925-b890-8743879a8b38 nodeName:}" failed. No retries permitted until 2026-03-09 13:24:37.78118266 +0000 UTC m=+233.031354668 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "stats-auth" (UniqueName: "kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-stats-auth") pod "router-default-5444994796-gnnbl" (UID: "b9c0d96b-ed96-4925-b890-8743879a8b38") : failed to sync secret cache: timed out waiting for the condition Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.281207 4764 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.281339 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/711447e6-e7cf-4577-8050-b5a391f96f6a-config podName:711447e6-e7cf-4577-8050-b5a391f96f6a nodeName:}" failed. No retries permitted until 2026-03-09 13:24:37.781285202 +0000 UTC m=+233.031457220 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/711447e6-e7cf-4577-8050-b5a391f96f6a-config") pod "openshift-kube-scheduler-operator-5fdd9b5758-xfvtf" (UID: "711447e6-e7cf-4577-8050-b5a391f96f6a") : failed to sync configmap cache: timed out waiting for the condition Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.281210 4764 configmap.go:193] Couldn't get configMap openshift-marketplace/marketplace-trusted-ca: failed to sync configmap cache: timed out waiting for the condition Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.281382 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14b4aa8c-1066-4388-9442-07722e4c76c2-webhook-cert podName:14b4aa8c-1066-4388-9442-07722e4c76c2 nodeName:}" failed. No retries permitted until 2026-03-09 13:24:37.781353824 +0000 UTC m=+233.031525732 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/14b4aa8c-1066-4388-9442-07722e4c76c2-webhook-cert") pod "packageserver-d55dfcdfc-m4bs6" (UID: "14b4aa8c-1066-4388-9442-07722e4c76c2") : failed to sync secret cache: timed out waiting for the condition Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.281405 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-operator-metrics podName:1ccc5b44-95ad-4f4c-8086-c176c41bbd19 nodeName:}" failed. No retries permitted until 2026-03-09 13:24:37.781395595 +0000 UTC m=+233.031567493 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-operator-metrics") pod "marketplace-operator-79b997595-d4gwh" (UID: "1ccc5b44-95ad-4f4c-8086-c176c41bbd19") : failed to sync secret cache: timed out waiting for the condition Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.281443 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-trusted-ca podName:1ccc5b44-95ad-4f4c-8086-c176c41bbd19 nodeName:}" failed. No retries permitted until 2026-03-09 13:24:37.781428076 +0000 UTC m=+233.031599994 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "marketplace-trusted-ca" (UniqueName: "kubernetes.io/configmap/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-trusted-ca") pod "marketplace-operator-79b997595-d4gwh" (UID: "1ccc5b44-95ad-4f4c-8086-c176c41bbd19") : failed to sync configmap cache: timed out waiting for the condition Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.296363 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.316191 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.337002 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.357031 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.377319 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.397166 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.416696 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.437216 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.442013 4764 generic.go:334] "Generic (PLEG): container finished" podID="1ffb8d96-e6e4-4859-ae7d-37f900979485" 
containerID="2327438a6c9ed8c1a989acd23de7dd26ca5827f91ab7de12fe2f05d2d7bc5774" exitCode=0 Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.442089 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gst9d" event={"ID":"1ffb8d96-e6e4-4859-ae7d-37f900979485","Type":"ContainerDied","Data":"2327438a6c9ed8c1a989acd23de7dd26ca5827f91ab7de12fe2f05d2d7bc5774"} Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.442140 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gst9d" event={"ID":"1ffb8d96-e6e4-4859-ae7d-37f900979485","Type":"ContainerStarted","Data":"1e3c772518ebee198daf071259de4022c5e3a22024e2e3619714d0e5c88c5454"} Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.456703 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.478768 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.498178 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.519329 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.537284 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.564692 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.578258 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" 
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.598037 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.617507 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.637250 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.656796 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.677011 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.696879 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.717174 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.736935 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.778185 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.796874 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 09 13:24:37 crc 
kubenswrapper[4764]: I0309 13:24:37.807280 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/30a07c97-9d99-41be-956e-ba3d6505d318-srv-cert\") pod \"olm-operator-6b444d44fb-ptbpd\" (UID: \"30a07c97-9d99-41be-956e-ba3d6505d318\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.807354 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-d4gwh\" (UID: \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\") " pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.807415 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-d4gwh\" (UID: \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\") " pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.807431 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14b4aa8c-1066-4388-9442-07722e4c76c2-webhook-cert\") pod \"packageserver-d55dfcdfc-m4bs6\" (UID: \"14b4aa8c-1066-4388-9442-07722e4c76c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.807458 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/711447e6-e7cf-4577-8050-b5a391f96f6a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xfvtf\" (UID: 
\"711447e6-e7cf-4577-8050-b5a391f96f6a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.807502 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-stats-auth\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.808951 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-default-certificate\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.809003 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9c0d96b-ed96-4925-b890-8743879a8b38-service-ca-bundle\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.809034 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b72bd4db-e5ea-44f6-bdce-81df2966acfb-srv-cert\") pod \"catalog-operator-68c6474976-m9kmt\" (UID: \"b72bd4db-e5ea-44f6-bdce-81df2966acfb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.809109 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-metrics-certs\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.809131 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14b4aa8c-1066-4388-9442-07722e4c76c2-apiservice-cert\") pod \"packageserver-d55dfcdfc-m4bs6\" (UID: \"14b4aa8c-1066-4388-9442-07722e4c76c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.809169 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/711447e6-e7cf-4577-8050-b5a391f96f6a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xfvtf\" (UID: \"711447e6-e7cf-4577-8050-b5a391f96f6a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.809545 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-d4gwh\" (UID: \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\") " pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.810231 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9c0d96b-ed96-4925-b890-8743879a8b38-service-ca-bundle\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.812599 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-d4gwh\" (UID: \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\") " pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.811891 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14b4aa8c-1066-4388-9442-07722e4c76c2-webhook-cert\") pod \"packageserver-d55dfcdfc-m4bs6\" (UID: \"14b4aa8c-1066-4388-9442-07722e4c76c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.812779 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-stats-auth\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.813175 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-default-certificate\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.813197 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b72bd4db-e5ea-44f6-bdce-81df2966acfb-srv-cert\") pod \"catalog-operator-68c6474976-m9kmt\" (UID: \"b72bd4db-e5ea-44f6-bdce-81df2966acfb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.813266 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14b4aa8c-1066-4388-9442-07722e4c76c2-apiservice-cert\") pod \"packageserver-d55dfcdfc-m4bs6\" (UID: \"14b4aa8c-1066-4388-9442-07722e4c76c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.813267 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-metrics-certs\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.813990 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/30a07c97-9d99-41be-956e-ba3d6505d318-srv-cert\") pod \"olm-operator-6b444d44fb-ptbpd\" (UID: \"30a07c97-9d99-41be-956e-ba3d6505d318\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.817611 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.836594 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.874760 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdsds\" (UniqueName: \"kubernetes.io/projected/f9b3244b-8df0-4330-9887-4092260d416a-kube-api-access-tdsds\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.891533 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x6ch\" (UniqueName: \"kubernetes.io/projected/baee6113-40a8-468e-b343-09e9afd65ce3-kube-api-access-8x6ch\") pod \"dns-operator-744455d44c-c2m6k\" (UID: \"baee6113-40a8-468e-b343-09e9afd65ce3\") " pod="openshift-dns-operator/dns-operator-744455d44c-c2m6k" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.910439 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgk9n\" (UniqueName: \"kubernetes.io/projected/6dc446a1-b77b-4f15-ae5f-0141bf374cdd-kube-api-access-pgk9n\") pod \"openshift-apiserver-operator-796bbdcf4f-k5dfc\" (UID: \"6dc446a1-b77b-4f15-ae5f-0141bf374cdd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.913530 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-c2m6k" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.940064 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p79zf\" (UniqueName: \"kubernetes.io/projected/b625331d-48ab-4d48-86fd-fe73466305ff-kube-api-access-p79zf\") pod \"controller-manager-879f6c89f-tgqwl\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.957495 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8fzv\" (UniqueName: \"kubernetes.io/projected/23f87d2b-2a92-4abb-a2a6-2de508837343-kube-api-access-j8fzv\") pod \"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.958111 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.971864 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrcvk\" (UniqueName: \"kubernetes.io/projected/4951d770-ae8c-470a-982a-807c82112722-kube-api-access-lrcvk\") pod \"authentication-operator-69f744f599-ptjnd\" (UID: \"4951d770-ae8c-470a-982a-807c82112722\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.993802 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k77xq\" (UniqueName: \"kubernetes.io/projected/cf3d7b2a-75e7-4c07-9211-b66c64c15def-kube-api-access-k77xq\") pod \"openshift-controller-manager-operator-756b6f6bc6-klv4l\" (UID: \"cf3d7b2a-75e7-4c07-9211-b66c64c15def\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.997891 4764 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.016790 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.037001 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.038469 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.068043 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.078801 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn7dv\" (UniqueName: \"kubernetes.io/projected/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-kube-api-access-vn7dv\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.101502 4764 request.go:700] Waited for 1.92812916s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/configmaps?fieldSelector=metadata.name%3Ddns-default&limit=500&resourceVersion=0 Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.103107 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.105521 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndfng\" (UniqueName: \"kubernetes.io/projected/61b85db0-a292-42a8-8296-d0e476d80c89-kube-api-access-ndfng\") pod \"openshift-config-operator-7777fb866f-fxv7j\" (UID: \"61b85db0-a292-42a8-8296-d0e476d80c89\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.116747 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.117788 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c2m6k"] Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.120352 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.137397 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.151340 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.163763 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc"] Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.175035 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48tpb\" (UniqueName: \"kubernetes.io/projected/ad49b4d8-1218-4a34-8455-831d0f563cbf-kube-api-access-48tpb\") pod \"route-controller-manager-6576b87f9c-mvq6r\" (UID: \"ad49b4d8-1218-4a34-8455-831d0f563cbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.187231 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.193007 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc9tr\" (UniqueName: \"kubernetes.io/projected/4125448d-5832-43c2-8dba-d95adde7458a-kube-api-access-vc9tr\") pod \"machine-api-operator-5694c8668f-t8ft9\" (UID: \"4125448d-5832-43c2-8dba-d95adde7458a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.214988 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qxpr\" (UniqueName: \"kubernetes.io/projected/e025e897-5ff3-476b-81c9-afdd0ae7a25f-kube-api-access-2qxpr\") pod \"cluster-image-registry-operator-dc59b4c8b-84448\" (UID: \"e025e897-5ff3-476b-81c9-afdd0ae7a25f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.225134 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ptjnd"] Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.231233 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e025e897-5ff3-476b-81c9-afdd0ae7a25f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-84448\" (UID: \"e025e897-5ff3-476b-81c9-afdd0ae7a25f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.241400 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.250001 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.257228 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.269354 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f9zl\" (UniqueName: \"kubernetes.io/projected/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-kube-api-access-9f9zl\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.278519 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.293596 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tgqwl"] Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.300305 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 09 13:24:38 crc kubenswrapper[4764]: W0309 13:24:38.317869 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb625331d_48ab_4d48_86fd_fe73466305ff.slice/crio-22fd4fd8035fc57032bbb13916cd7b7736a6fafcf098f02294f9432a8954ab17 WatchSource:0}: Error finding container 22fd4fd8035fc57032bbb13916cd7b7736a6fafcf098f02294f9432a8954ab17: Status 404 returned error can't find the container with id 22fd4fd8035fc57032bbb13916cd7b7736a6fafcf098f02294f9432a8954ab17 Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.334790 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgwzj\" (UniqueName: 
\"kubernetes.io/projected/ee2ad8bf-7cf9-4bab-9638-b26d9c593188-kube-api-access-bgwzj\") pod \"console-operator-58897d9998-sp2mq\" (UID: \"ee2ad8bf-7cf9-4bab-9638-b26d9c593188\") " pod="openshift-console-operator/console-operator-58897d9998-sp2mq" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.369043 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l"] Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.376414 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgbg9\" (UniqueName: \"kubernetes.io/projected/b72bd4db-e5ea-44f6-bdce-81df2966acfb-kube-api-access-mgbg9\") pod \"catalog-operator-68c6474976-m9kmt\" (UID: \"b72bd4db-e5ea-44f6-bdce-81df2966acfb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.394617 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.396392 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5qdx\" (UniqueName: \"kubernetes.io/projected/db0ec273-54d8-4753-b519-243b727a9efd-kube-api-access-f5qdx\") pod \"package-server-manager-789f6589d5-b2pz8\" (UID: \"db0ec273-54d8-4753-b519-243b727a9efd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.416248 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g47h8\" (UniqueName: \"kubernetes.io/projected/0a005f65-920a-4cdd-b4da-a270953113aa-kube-api-access-g47h8\") pod \"auto-csr-approver-29551044-p748f\" (UID: \"0a005f65-920a-4cdd-b4da-a270953113aa\") " pod="openshift-infra/auto-csr-approver-29551044-p748f" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.429209 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.452429 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpcrn\" (UniqueName: \"kubernetes.io/projected/4912a02a-743d-4bbe-9063-7d99ccd3329a-kube-api-access-jpcrn\") pod \"machine-config-operator-74547568cd-2pfhk\" (UID: \"4912a02a-743d-4bbe-9063-7d99ccd3329a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk" Mar 09 13:24:38 crc kubenswrapper[4764]: W0309 13:24:38.456867 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf3d7b2a_75e7_4c07_9211_b66c64c15def.slice/crio-3a2f5d25288e8bc86cb3cee224bc80c295dc724258bc8041325b003c8ff187db WatchSource:0}: Error finding container 3a2f5d25288e8bc86cb3cee224bc80c295dc724258bc8041325b003c8ff187db: Status 404 returned error can't find the container with id 3a2f5d25288e8bc86cb3cee224bc80c295dc724258bc8041325b003c8ff187db Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.459280 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdqjn\" (UniqueName: \"kubernetes.io/projected/4ba55602-0e3f-4722-b437-546732351bc4-kube-api-access-fdqjn\") pod \"control-plane-machine-set-operator-78cbb6b69f-9k28f\" (UID: \"4ba55602-0e3f-4722-b437-546732351bc4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9k28f" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.462908 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gst9d" event={"ID":"1ffb8d96-e6e4-4859-ae7d-37f900979485","Type":"ContainerStarted","Data":"c416f2dc40385063724840456179604ca4883ff273e05587d4d08e7c6e5aa92a"} Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.462952 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver/apiserver-76f77b778f-gst9d" event={"ID":"1ffb8d96-e6e4-4859-ae7d-37f900979485","Type":"ContainerStarted","Data":"44e73e4e6b5264ba8304d7c2411b0e137ff9d4aa3edb572b0be2d425d0cdc0e1"} Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.463909 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c2m6k" event={"ID":"baee6113-40a8-468e-b343-09e9afd65ce3","Type":"ContainerStarted","Data":"6fb4329e2fa9390478b8f6dd1725de8444e1017e3158aebde222bd496b98fb80"} Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.469289 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd" event={"ID":"4951d770-ae8c-470a-982a-807c82112722","Type":"ContainerStarted","Data":"71223c683cd0648308c14fd45ca542bb5d131f54ba2d79963157873d6c25e727"} Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.473590 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2l2r\" (UniqueName: \"kubernetes.io/projected/ad307984-e46b-466b-8a5c-63a00976fbbf-kube-api-access-b2l2r\") pod \"cluster-samples-operator-665b6dd947-5g2pp\" (UID: \"ad307984-e46b-466b-8a5c-63a00976fbbf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5g2pp" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.482552 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r"] Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.485171 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc" event={"ID":"6dc446a1-b77b-4f15-ae5f-0141bf374cdd","Type":"ContainerStarted","Data":"17d5b491d947e91ddf28c82de6dbc1281f0bec8b013043771e2e286b4c427bba"} Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.489322 4764 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" event={"ID":"b625331d-48ab-4d48-86fd-fe73466305ff","Type":"ContainerStarted","Data":"22fd4fd8035fc57032bbb13916cd7b7736a6fafcf098f02294f9432a8954ab17"} Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.491633 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0959a00-2a83-457f-bcba-7d4af48b11c3-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-tvsss\" (UID: \"d0959a00-2a83-457f-bcba-7d4af48b11c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.497018 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-sp2mq" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.509298 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nj856"] Mar 09 13:24:38 crc kubenswrapper[4764]: W0309 13:24:38.511793 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad49b4d8_1218_4a34_8455_831d0f563cbf.slice/crio-3ebce9784954b6d704d045dc1cbbc83b452607e4967b3e3637a00f82a3e69729 WatchSource:0}: Error finding container 3ebce9784954b6d704d045dc1cbbc83b452607e4967b3e3637a00f82a3e69729: Status 404 returned error can't find the container with id 3ebce9784954b6d704d045dc1cbbc83b452607e4967b3e3637a00f82a3e69729 Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.518962 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.519595 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67pl5\" (UniqueName: \"kubernetes.io/projected/b9c0d96b-ed96-4925-b890-8743879a8b38-kube-api-access-67pl5\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.529157 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.540891 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/711447e6-e7cf-4577-8050-b5a391f96f6a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xfvtf\" (UID: \"711447e6-e7cf-4577-8050-b5a391f96f6a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.541091 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551044-p748f" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.550829 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.556456 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss45h\" (UniqueName: \"kubernetes.io/projected/39da5087-79bc-4154-b340-22183d9e4417-kube-api-access-ss45h\") pod \"collect-profiles-29551035-5fpgn\" (UID: \"39da5087-79bc-4154-b340-22183d9e4417\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.562936 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nnrmm"] Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.566463 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5g2pp" Mar 09 13:24:38 crc kubenswrapper[4764]: W0309 13:24:38.573330 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9b3244b_8df0_4330_9887_4092260d416a.slice/crio-42c43e046cf7b3c37122d69fdadfdf37b294da8c4d73ad8e7c4ac09039e43f1c WatchSource:0}: Error finding container 42c43e046cf7b3c37122d69fdadfdf37b294da8c4d73ad8e7c4ac09039e43f1c: Status 404 returned error can't find the container with id 42c43e046cf7b3c37122d69fdadfdf37b294da8c4d73ad8e7c4ac09039e43f1c Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.582873 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s85j9\" (UniqueName: \"kubernetes.io/projected/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-kube-api-access-s85j9\") pod \"marketplace-operator-79b997595-d4gwh\" (UID: \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\") " pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" Mar 09 13:24:38 crc kubenswrapper[4764]: W0309 13:24:38.593141 4764 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23f87d2b_2a92_4abb_a2a6_2de508837343.slice/crio-06f569219e270618320b5e9344203a70dabe7cff34d14271dabece2d5bfeef8b WatchSource:0}: Error finding container 06f569219e270618320b5e9344203a70dabe7cff34d14271dabece2d5bfeef8b: Status 404 returned error can't find the container with id 06f569219e270618320b5e9344203a70dabe7cff34d14271dabece2d5bfeef8b Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.594495 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9vzq\" (UniqueName: \"kubernetes.io/projected/14b4aa8c-1066-4388-9442-07722e4c76c2-kube-api-access-w9vzq\") pod \"packageserver-d55dfcdfc-m4bs6\" (UID: \"14b4aa8c-1066-4388-9442-07722e4c76c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.617035 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7"] Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.634244 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdclx\" (UniqueName: \"kubernetes.io/projected/30a07c97-9d99-41be-956e-ba3d6505d318-kube-api-access-rdclx\") pod \"olm-operator-6b444d44fb-ptbpd\" (UID: \"30a07c97-9d99-41be-956e-ba3d6505d318\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.667783 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j"] Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.668924 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.669465 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.669877 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.673840 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnv7g\" (UniqueName: \"kubernetes.io/projected/8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497-kube-api-access-vnv7g\") pod \"service-ca-operator-777779d784-svr2n\" (UID: \"8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-svr2n" Mar 09 13:24:38 crc kubenswrapper[4764]: W0309 13:24:38.683129 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf95ed010_a6a4_49ab_b61b_fc4ee2d856bb.slice/crio-cfd2c52e9451edb42c1574c3bfe44c1c8e89acf647b496a5ee4bfe6db1b87112 WatchSource:0}: Error finding container cfd2c52e9451edb42c1574c3bfe44c1c8e89acf647b496a5ee4bfe6db1b87112: Status 404 returned error can't find the container with id cfd2c52e9451edb42c1574c3bfe44c1c8e89acf647b496a5ee4bfe6db1b87112 Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.691297 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8" Mar 09 13:24:38 crc kubenswrapper[4764]: W0309 13:24:38.718976 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61b85db0_a292_42a8_8296_d0e476d80c89.slice/crio-f998595181102b6c527dff000f09d875d8d715e5310d9ab7d8ba0e72dc742c89 WatchSource:0}: Error finding container f998595181102b6c527dff000f09d875d8d715e5310d9ab7d8ba0e72dc742c89: Status 404 returned error can't find the container with id f998595181102b6c527dff000f09d875d8d715e5310d9ab7d8ba0e72dc742c89 Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.723408 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t8ft9"] Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726075 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f4f31f9d-d2da-4e3d-b002-2f477d344fc0-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-kt67m\" (UID: \"f4f31f9d-d2da-4e3d-b002-2f477d344fc0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726123 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpnqx\" (UniqueName: \"kubernetes.io/projected/2c795327-bff9-453d-b1b2-d48a2ef2b48f-kube-api-access-gpnqx\") pod \"migrator-59844c95c7-b6nrf\" (UID: \"2c795327-bff9-453d-b1b2-d48a2ef2b48f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b6nrf" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726143 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1fe7eff7-ce11-4d46-bf25-06162522c1ff-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6brcn\" (UID: \"1fe7eff7-ce11-4d46-bf25-06162522c1ff\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726161 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdhqw\" (UniqueName: \"kubernetes.io/projected/2968d8f2-48fa-470b-b90e-41bb83bd77c4-kube-api-access-bdhqw\") pod \"machine-approver-56656f9798-cvsb9\" (UID: \"2968d8f2-48fa-470b-b90e-41bb83bd77c4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726177 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fe7eff7-ce11-4d46-bf25-06162522c1ff-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6brcn\" (UID: \"1fe7eff7-ce11-4d46-bf25-06162522c1ff\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726252 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st8mw\" (UniqueName: \"kubernetes.io/projected/d245b116-e47d-4b15-a40d-0a9fa34cf1df-kube-api-access-st8mw\") pod \"downloads-7954f5f757-x927s\" (UID: \"d245b116-e47d-4b15-a40d-0a9fa34cf1df\") " pod="openshift-console/downloads-7954f5f757-x927s" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726297 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/24df0ad8-c9b4-46b6-8751-23e26fc391c5-signing-cabundle\") pod \"service-ca-9c57cc56f-qddjs\" (UID: 
\"24df0ad8-c9b4-46b6-8751-23e26fc391c5\") " pod="openshift-service-ca/service-ca-9c57cc56f-qddjs" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726328 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d3652fe0-4889-432f-af3f-787dd19c60d6-installation-pull-secrets\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726361 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/24df0ad8-c9b4-46b6-8751-23e26fc391c5-signing-key\") pod \"service-ca-9c57cc56f-qddjs\" (UID: \"24df0ad8-c9b4-46b6-8751-23e26fc391c5\") " pod="openshift-service-ca/service-ca-9c57cc56f-qddjs" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726386 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-bound-sa-token\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726401 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tw8z\" (UniqueName: \"kubernetes.io/projected/f4f31f9d-d2da-4e3d-b002-2f477d344fc0-kube-api-access-7tw8z\") pod \"machine-config-controller-84d6567774-kt67m\" (UID: \"f4f31f9d-d2da-4e3d-b002-2f477d344fc0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726431 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d3652fe0-4889-432f-af3f-787dd19c60d6-trusted-ca\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726469 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4f31f9d-d2da-4e3d-b002-2f477d344fc0-proxy-tls\") pod \"machine-config-controller-84d6567774-kt67m\" (UID: \"f4f31f9d-d2da-4e3d-b002-2f477d344fc0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726486 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36228b47-79c7-484d-9753-6f36806aa344-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-mn5w6\" (UID: \"36228b47-79c7-484d-9753-6f36806aa344\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726525 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39249ace-8681-491c-a547-869e72d297a7-metrics-tls\") pod \"ingress-operator-5b745b69d9-cd5zc\" (UID: \"39249ace-8681-491c-a547-869e72d297a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726550 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d3652fe0-4889-432f-af3f-787dd19c60d6-registry-certificates\") pod \"image-registry-697d97f7c8-w49hn\" (UID: 
\"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726575 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726606 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36228b47-79c7-484d-9753-6f36806aa344-config\") pod \"kube-controller-manager-operator-78b949d7b-mn5w6\" (UID: \"36228b47-79c7-484d-9753-6f36806aa344\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726636 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/957cfae4-9b1a-4e7f-b526-60bc1ee1a1b4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hf98d\" (UID: \"957cfae4-9b1a-4e7f-b526-60bc1ee1a1b4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hf98d" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726688 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4js2s\" (UniqueName: \"kubernetes.io/projected/957cfae4-9b1a-4e7f-b526-60bc1ee1a1b4-kube-api-access-4js2s\") pod \"multus-admission-controller-857f4d67dd-hf98d\" (UID: \"957cfae4-9b1a-4e7f-b526-60bc1ee1a1b4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hf98d" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726724 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftcvk\" (UniqueName: \"kubernetes.io/projected/39249ace-8681-491c-a547-869e72d297a7-kube-api-access-ftcvk\") pod \"ingress-operator-5b745b69d9-cd5zc\" (UID: \"39249ace-8681-491c-a547-869e72d297a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726743 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsz44\" (UniqueName: \"kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-kube-api-access-xsz44\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726769 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l68gm\" (UniqueName: \"kubernetes.io/projected/24df0ad8-c9b4-46b6-8751-23e26fc391c5-kube-api-access-l68gm\") pod \"service-ca-9c57cc56f-qddjs\" (UID: \"24df0ad8-c9b4-46b6-8751-23e26fc391c5\") " pod="openshift-service-ca/service-ca-9c57cc56f-qddjs" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726783 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2968d8f2-48fa-470b-b90e-41bb83bd77c4-auth-proxy-config\") pod \"machine-approver-56656f9798-cvsb9\" (UID: \"2968d8f2-48fa-470b-b90e-41bb83bd77c4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726817 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2968d8f2-48fa-470b-b90e-41bb83bd77c4-machine-approver-tls\") pod 
\"machine-approver-56656f9798-cvsb9\" (UID: \"2968d8f2-48fa-470b-b90e-41bb83bd77c4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726834 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2968d8f2-48fa-470b-b90e-41bb83bd77c4-config\") pod \"machine-approver-56656f9798-cvsb9\" (UID: \"2968d8f2-48fa-470b-b90e-41bb83bd77c4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726852 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39249ace-8681-491c-a547-869e72d297a7-trusted-ca\") pod \"ingress-operator-5b745b69d9-cd5zc\" (UID: \"39249ace-8681-491c-a547-869e72d297a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726866 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39249ace-8681-491c-a547-869e72d297a7-bound-sa-token\") pod \"ingress-operator-5b745b69d9-cd5zc\" (UID: \"39249ace-8681-491c-a547-869e72d297a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726883 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6ztl\" (UniqueName: \"kubernetes.io/projected/1fe7eff7-ce11-4d46-bf25-06162522c1ff-kube-api-access-q6ztl\") pod \"kube-storage-version-migrator-operator-b67b599dd-6brcn\" (UID: \"1fe7eff7-ce11-4d46-bf25-06162522c1ff\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn" Mar 09 13:24:38 crc 
kubenswrapper[4764]: I0309 13:24:38.726899 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36228b47-79c7-484d-9753-6f36806aa344-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-mn5w6\" (UID: \"36228b47-79c7-484d-9753-6f36806aa344\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726924 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-registry-tls\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726945 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d3652fe0-4889-432f-af3f-787dd19c60d6-ca-trust-extracted\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:38 crc kubenswrapper[4764]: E0309 13:24:38.730464 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:39.230451675 +0000 UTC m=+234.480623583 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.735864 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-svr2n" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.755265 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9k28f" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.772214 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.783362 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-sp2mq"] Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.786970 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.796310 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.804086 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.819872 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.830834 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:38 crc kubenswrapper[4764]: E0309 13:24:38.830953 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:39.330927416 +0000 UTC m=+234.581099324 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.831103 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0c464093-f30e-4c79-84cd-98a6d49b813c-metrics-tls\") pod \"dns-default-6nhpx\" (UID: \"0c464093-f30e-4c79-84cd-98a6d49b813c\") " pod="openshift-dns/dns-default-6nhpx" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.831155 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2968d8f2-48fa-470b-b90e-41bb83bd77c4-machine-approver-tls\") pod \"machine-approver-56656f9798-cvsb9\" (UID: \"2968d8f2-48fa-470b-b90e-41bb83bd77c4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.831177 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2968d8f2-48fa-470b-b90e-41bb83bd77c4-config\") pod \"machine-approver-56656f9798-cvsb9\" (UID: \"2968d8f2-48fa-470b-b90e-41bb83bd77c4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.831219 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxvw8\" (UniqueName: \"kubernetes.io/projected/0c464093-f30e-4c79-84cd-98a6d49b813c-kube-api-access-kxvw8\") pod \"dns-default-6nhpx\" (UID: 
\"0c464093-f30e-4c79-84cd-98a6d49b813c\") " pod="openshift-dns/dns-default-6nhpx" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.831246 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39249ace-8681-491c-a547-869e72d297a7-trusted-ca\") pod \"ingress-operator-5b745b69d9-cd5zc\" (UID: \"39249ace-8681-491c-a547-869e72d297a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.831945 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39249ace-8681-491c-a547-869e72d297a7-bound-sa-token\") pod \"ingress-operator-5b745b69d9-cd5zc\" (UID: \"39249ace-8681-491c-a547-869e72d297a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.831968 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6ztl\" (UniqueName: \"kubernetes.io/projected/1fe7eff7-ce11-4d46-bf25-06162522c1ff-kube-api-access-q6ztl\") pod \"kube-storage-version-migrator-operator-b67b599dd-6brcn\" (UID: \"1fe7eff7-ce11-4d46-bf25-06162522c1ff\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.831994 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36228b47-79c7-484d-9753-6f36806aa344-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-mn5w6\" (UID: \"36228b47-79c7-484d-9753-6f36806aa344\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832037 4764 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-registry-tls\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832065 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d3652fe0-4889-432f-af3f-787dd19c60d6-ca-trust-extracted\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832127 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvm6v\" (UniqueName: \"kubernetes.io/projected/e56f29e6-292c-48d2-a5d7-531cfa6689f5-kube-api-access-qvm6v\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832213 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f4f31f9d-d2da-4e3d-b002-2f477d344fc0-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-kt67m\" (UID: \"f4f31f9d-d2da-4e3d-b002-2f477d344fc0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832375 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e56f29e6-292c-48d2-a5d7-531cfa6689f5-socket-dir\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" Mar 09 13:24:38 crc 
kubenswrapper[4764]: I0309 13:24:38.832404 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpnqx\" (UniqueName: \"kubernetes.io/projected/2c795327-bff9-453d-b1b2-d48a2ef2b48f-kube-api-access-gpnqx\") pod \"migrator-59844c95c7-b6nrf\" (UID: \"2c795327-bff9-453d-b1b2-d48a2ef2b48f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b6nrf" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832428 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fe7eff7-ce11-4d46-bf25-06162522c1ff-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6brcn\" (UID: \"1fe7eff7-ce11-4d46-bf25-06162522c1ff\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832452 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0c464093-f30e-4c79-84cd-98a6d49b813c-config-volume\") pod \"dns-default-6nhpx\" (UID: \"0c464093-f30e-4c79-84cd-98a6d49b813c\") " pod="openshift-dns/dns-default-6nhpx" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832531 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e56f29e6-292c-48d2-a5d7-531cfa6689f5-mountpoint-dir\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832554 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fe7eff7-ce11-4d46-bf25-06162522c1ff-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6brcn\" (UID: 
\"1fe7eff7-ce11-4d46-bf25-06162522c1ff\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832607 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdhqw\" (UniqueName: \"kubernetes.io/projected/2968d8f2-48fa-470b-b90e-41bb83bd77c4-kube-api-access-bdhqw\") pod \"machine-approver-56656f9798-cvsb9\" (UID: \"2968d8f2-48fa-470b-b90e-41bb83bd77c4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832685 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st8mw\" (UniqueName: \"kubernetes.io/projected/d245b116-e47d-4b15-a40d-0a9fa34cf1df-kube-api-access-st8mw\") pod \"downloads-7954f5f757-x927s\" (UID: \"d245b116-e47d-4b15-a40d-0a9fa34cf1df\") " pod="openshift-console/downloads-7954f5f757-x927s" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832734 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrqh5\" (UniqueName: \"kubernetes.io/projected/b7bf830c-e91a-4dd1-a3b5-64ca95d57e44-kube-api-access-mrqh5\") pod \"ingress-canary-k96kg\" (UID: \"b7bf830c-e91a-4dd1-a3b5-64ca95d57e44\") " pod="openshift-ingress-canary/ingress-canary-k96kg" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832762 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/24df0ad8-c9b4-46b6-8751-23e26fc391c5-signing-cabundle\") pod \"service-ca-9c57cc56f-qddjs\" (UID: \"24df0ad8-c9b4-46b6-8751-23e26fc391c5\") " pod="openshift-service-ca/service-ca-9c57cc56f-qddjs" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832841 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" 
(UniqueName: \"kubernetes.io/secret/d3652fe0-4889-432f-af3f-787dd19c60d6-installation-pull-secrets\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832877 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e56f29e6-292c-48d2-a5d7-531cfa6689f5-csi-data-dir\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832925 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/24df0ad8-c9b4-46b6-8751-23e26fc391c5-signing-key\") pod \"service-ca-9c57cc56f-qddjs\" (UID: \"24df0ad8-c9b4-46b6-8751-23e26fc391c5\") " pod="openshift-service-ca/service-ca-9c57cc56f-qddjs" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832951 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-bound-sa-token\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832972 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tw8z\" (UniqueName: \"kubernetes.io/projected/f4f31f9d-d2da-4e3d-b002-2f477d344fc0-kube-api-access-7tw8z\") pod \"machine-config-controller-84d6567774-kt67m\" (UID: \"f4f31f9d-d2da-4e3d-b002-2f477d344fc0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833040 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7bf830c-e91a-4dd1-a3b5-64ca95d57e44-cert\") pod \"ingress-canary-k96kg\" (UID: \"b7bf830c-e91a-4dd1-a3b5-64ca95d57e44\") " pod="openshift-ingress-canary/ingress-canary-k96kg" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833145 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d3652fe0-4889-432f-af3f-787dd19c60d6-trusted-ca\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833170 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4f31f9d-d2da-4e3d-b002-2f477d344fc0-proxy-tls\") pod \"machine-config-controller-84d6567774-kt67m\" (UID: \"f4f31f9d-d2da-4e3d-b002-2f477d344fc0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833241 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36228b47-79c7-484d-9753-6f36806aa344-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-mn5w6\" (UID: \"36228b47-79c7-484d-9753-6f36806aa344\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833277 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e56f29e6-292c-48d2-a5d7-531cfa6689f5-plugins-dir\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " 
pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833322 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8e5162b2-c722-4535-9adf-3af0eee24211-node-bootstrap-token\") pod \"machine-config-server-kdxg4\" (UID: \"8e5162b2-c722-4535-9adf-3af0eee24211\") " pod="openshift-machine-config-operator/machine-config-server-kdxg4" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833343 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8e5162b2-c722-4535-9adf-3af0eee24211-certs\") pod \"machine-config-server-kdxg4\" (UID: \"8e5162b2-c722-4535-9adf-3af0eee24211\") " pod="openshift-machine-config-operator/machine-config-server-kdxg4" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833362 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhg8d\" (UniqueName: \"kubernetes.io/projected/8e5162b2-c722-4535-9adf-3af0eee24211-kube-api-access-nhg8d\") pod \"machine-config-server-kdxg4\" (UID: \"8e5162b2-c722-4535-9adf-3af0eee24211\") " pod="openshift-machine-config-operator/machine-config-server-kdxg4" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833399 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39249ace-8681-491c-a547-869e72d297a7-metrics-tls\") pod \"ingress-operator-5b745b69d9-cd5zc\" (UID: \"39249ace-8681-491c-a547-869e72d297a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833434 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/d3652fe0-4889-432f-af3f-787dd19c60d6-registry-certificates\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833464 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36228b47-79c7-484d-9753-6f36806aa344-config\") pod \"kube-controller-manager-operator-78b949d7b-mn5w6\" (UID: \"36228b47-79c7-484d-9753-6f36806aa344\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833516 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833552 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/957cfae4-9b1a-4e7f-b526-60bc1ee1a1b4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hf98d\" (UID: \"957cfae4-9b1a-4e7f-b526-60bc1ee1a1b4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hf98d" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833620 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4js2s\" (UniqueName: \"kubernetes.io/projected/957cfae4-9b1a-4e7f-b526-60bc1ee1a1b4-kube-api-access-4js2s\") pod \"multus-admission-controller-857f4d67dd-hf98d\" (UID: \"957cfae4-9b1a-4e7f-b526-60bc1ee1a1b4\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-hf98d" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833689 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftcvk\" (UniqueName: \"kubernetes.io/projected/39249ace-8681-491c-a547-869e72d297a7-kube-api-access-ftcvk\") pod \"ingress-operator-5b745b69d9-cd5zc\" (UID: \"39249ace-8681-491c-a547-869e72d297a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833726 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsz44\" (UniqueName: \"kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-kube-api-access-xsz44\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833775 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e56f29e6-292c-48d2-a5d7-531cfa6689f5-registration-dir\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833797 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2968d8f2-48fa-470b-b90e-41bb83bd77c4-auth-proxy-config\") pod \"machine-approver-56656f9798-cvsb9\" (UID: \"2968d8f2-48fa-470b-b90e-41bb83bd77c4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833834 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l68gm\" (UniqueName: 
\"kubernetes.io/projected/24df0ad8-c9b4-46b6-8751-23e26fc391c5-kube-api-access-l68gm\") pod \"service-ca-9c57cc56f-qddjs\" (UID: \"24df0ad8-c9b4-46b6-8751-23e26fc391c5\") " pod="openshift-service-ca/service-ca-9c57cc56f-qddjs" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.835150 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2968d8f2-48fa-470b-b90e-41bb83bd77c4-config\") pod \"machine-approver-56656f9798-cvsb9\" (UID: \"2968d8f2-48fa-470b-b90e-41bb83bd77c4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.835204 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f4f31f9d-d2da-4e3d-b002-2f477d344fc0-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-kt67m\" (UID: \"f4f31f9d-d2da-4e3d-b002-2f477d344fc0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.835770 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2968d8f2-48fa-470b-b90e-41bb83bd77c4-machine-approver-tls\") pod \"machine-approver-56656f9798-cvsb9\" (UID: \"2968d8f2-48fa-470b-b90e-41bb83bd77c4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.837936 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fe7eff7-ce11-4d46-bf25-06162522c1ff-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6brcn\" (UID: \"1fe7eff7-ce11-4d46-bf25-06162522c1ff\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn" Mar 09 13:24:38 crc kubenswrapper[4764]: 
E0309 13:24:38.844018 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:39.344004098 +0000 UTC m=+234.594176006 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.857555 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d3652fe0-4889-432f-af3f-787dd19c60d6-ca-trust-extracted\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.862088 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36228b47-79c7-484d-9753-6f36806aa344-config\") pod \"kube-controller-manager-operator-78b949d7b-mn5w6\" (UID: \"36228b47-79c7-484d-9753-6f36806aa344\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.868423 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/24df0ad8-c9b4-46b6-8751-23e26fc391c5-signing-cabundle\") pod \"service-ca-9c57cc56f-qddjs\" (UID: \"24df0ad8-c9b4-46b6-8751-23e26fc391c5\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-qddjs" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.869522 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36228b47-79c7-484d-9753-6f36806aa344-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-mn5w6\" (UID: \"36228b47-79c7-484d-9753-6f36806aa344\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.870569 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4f31f9d-d2da-4e3d-b002-2f477d344fc0-proxy-tls\") pod \"machine-config-controller-84d6567774-kt67m\" (UID: \"f4f31f9d-d2da-4e3d-b002-2f477d344fc0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.879969 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l68gm\" (UniqueName: \"kubernetes.io/projected/24df0ad8-c9b4-46b6-8751-23e26fc391c5-kube-api-access-l68gm\") pod \"service-ca-9c57cc56f-qddjs\" (UID: \"24df0ad8-c9b4-46b6-8751-23e26fc391c5\") " pod="openshift-service-ca/service-ca-9c57cc56f-qddjs" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.880226 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d3652fe0-4889-432f-af3f-787dd19c60d6-registry-certificates\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.880573 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fe7eff7-ce11-4d46-bf25-06162522c1ff-serving-cert\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-6brcn\" (UID: \"1fe7eff7-ce11-4d46-bf25-06162522c1ff\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.881006 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2968d8f2-48fa-470b-b90e-41bb83bd77c4-auth-proxy-config\") pod \"machine-approver-56656f9798-cvsb9\" (UID: \"2968d8f2-48fa-470b-b90e-41bb83bd77c4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.881431 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-registry-tls\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.881576 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39249ace-8681-491c-a547-869e72d297a7-trusted-ca\") pod \"ingress-operator-5b745b69d9-cd5zc\" (UID: \"39249ace-8681-491c-a547-869e72d297a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.882402 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39249ace-8681-491c-a547-869e72d297a7-metrics-tls\") pod \"ingress-operator-5b745b69d9-cd5zc\" (UID: \"39249ace-8681-491c-a547-869e72d297a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.882804 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/39249ace-8681-491c-a547-869e72d297a7-bound-sa-token\") pod \"ingress-operator-5b745b69d9-cd5zc\" (UID: \"39249ace-8681-491c-a547-869e72d297a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.883072 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d3652fe0-4889-432f-af3f-787dd19c60d6-trusted-ca\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.884823 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/24df0ad8-c9b4-46b6-8751-23e26fc391c5-signing-key\") pod \"service-ca-9c57cc56f-qddjs\" (UID: \"24df0ad8-c9b4-46b6-8751-23e26fc391c5\") " pod="openshift-service-ca/service-ca-9c57cc56f-qddjs" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.885587 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d3652fe0-4889-432f-af3f-787dd19c60d6-installation-pull-secrets\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.905108 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/957cfae4-9b1a-4e7f-b526-60bc1ee1a1b4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hf98d\" (UID: \"957cfae4-9b1a-4e7f-b526-60bc1ee1a1b4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hf98d" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.925218 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-q6ztl\" (UniqueName: \"kubernetes.io/projected/1fe7eff7-ce11-4d46-bf25-06162522c1ff-kube-api-access-q6ztl\") pod \"kube-storage-version-migrator-operator-b67b599dd-6brcn\" (UID: \"1fe7eff7-ce11-4d46-bf25-06162522c1ff\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.954622 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpnqx\" (UniqueName: \"kubernetes.io/projected/2c795327-bff9-453d-b1b2-d48a2ef2b48f-kube-api-access-gpnqx\") pod \"migrator-59844c95c7-b6nrf\" (UID: \"2c795327-bff9-453d-b1b2-d48a2ef2b48f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b6nrf" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.955127 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.955354 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e56f29e6-292c-48d2-a5d7-531cfa6689f5-socket-dir\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.955380 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0c464093-f30e-4c79-84cd-98a6d49b813c-config-volume\") pod \"dns-default-6nhpx\" (UID: \"0c464093-f30e-4c79-84cd-98a6d49b813c\") " pod="openshift-dns/dns-default-6nhpx" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.955403 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e56f29e6-292c-48d2-a5d7-531cfa6689f5-mountpoint-dir\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.955446 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrqh5\" (UniqueName: \"kubernetes.io/projected/b7bf830c-e91a-4dd1-a3b5-64ca95d57e44-kube-api-access-mrqh5\") pod \"ingress-canary-k96kg\" (UID: \"b7bf830c-e91a-4dd1-a3b5-64ca95d57e44\") " pod="openshift-ingress-canary/ingress-canary-k96kg" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.955472 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e56f29e6-292c-48d2-a5d7-531cfa6689f5-csi-data-dir\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.955512 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7bf830c-e91a-4dd1-a3b5-64ca95d57e44-cert\") pod \"ingress-canary-k96kg\" (UID: \"b7bf830c-e91a-4dd1-a3b5-64ca95d57e44\") " pod="openshift-ingress-canary/ingress-canary-k96kg" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.955537 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e56f29e6-292c-48d2-a5d7-531cfa6689f5-plugins-dir\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.955562 4764 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8e5162b2-c722-4535-9adf-3af0eee24211-node-bootstrap-token\") pod \"machine-config-server-kdxg4\" (UID: \"8e5162b2-c722-4535-9adf-3af0eee24211\") " pod="openshift-machine-config-operator/machine-config-server-kdxg4" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.955581 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8e5162b2-c722-4535-9adf-3af0eee24211-certs\") pod \"machine-config-server-kdxg4\" (UID: \"8e5162b2-c722-4535-9adf-3af0eee24211\") " pod="openshift-machine-config-operator/machine-config-server-kdxg4" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.955603 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhg8d\" (UniqueName: \"kubernetes.io/projected/8e5162b2-c722-4535-9adf-3af0eee24211-kube-api-access-nhg8d\") pod \"machine-config-server-kdxg4\" (UID: \"8e5162b2-c722-4535-9adf-3af0eee24211\") " pod="openshift-machine-config-operator/machine-config-server-kdxg4" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.955709 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e56f29e6-292c-48d2-a5d7-531cfa6689f5-registration-dir\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.955736 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0c464093-f30e-4c79-84cd-98a6d49b813c-metrics-tls\") pod \"dns-default-6nhpx\" (UID: \"0c464093-f30e-4c79-84cd-98a6d49b813c\") " pod="openshift-dns/dns-default-6nhpx" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.955759 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-kxvw8\" (UniqueName: \"kubernetes.io/projected/0c464093-f30e-4c79-84cd-98a6d49b813c-kube-api-access-kxvw8\") pod \"dns-default-6nhpx\" (UID: \"0c464093-f30e-4c79-84cd-98a6d49b813c\") " pod="openshift-dns/dns-default-6nhpx" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.955805 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvm6v\" (UniqueName: \"kubernetes.io/projected/e56f29e6-292c-48d2-a5d7-531cfa6689f5-kube-api-access-qvm6v\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" Mar 09 13:24:38 crc kubenswrapper[4764]: E0309 13:24:38.956051 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:39.4560324 +0000 UTC m=+234.706204308 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.956260 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e56f29e6-292c-48d2-a5d7-531cfa6689f5-socket-dir\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.956925 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0c464093-f30e-4c79-84cd-98a6d49b813c-config-volume\") pod \"dns-default-6nhpx\" (UID: \"0c464093-f30e-4c79-84cd-98a6d49b813c\") " pod="openshift-dns/dns-default-6nhpx" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.956943 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e56f29e6-292c-48d2-a5d7-531cfa6689f5-csi-data-dir\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.957149 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e56f29e6-292c-48d2-a5d7-531cfa6689f5-mountpoint-dir\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 
13:24:38.973484 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e56f29e6-292c-48d2-a5d7-531cfa6689f5-plugins-dir\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.974276 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8e5162b2-c722-4535-9adf-3af0eee24211-certs\") pod \"machine-config-server-kdxg4\" (UID: \"8e5162b2-c722-4535-9adf-3af0eee24211\") " pod="openshift-machine-config-operator/machine-config-server-kdxg4" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.974977 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8e5162b2-c722-4535-9adf-3af0eee24211-node-bootstrap-token\") pod \"machine-config-server-kdxg4\" (UID: \"8e5162b2-c722-4535-9adf-3af0eee24211\") " pod="openshift-machine-config-operator/machine-config-server-kdxg4" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.976117 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7bf830c-e91a-4dd1-a3b5-64ca95d57e44-cert\") pod \"ingress-canary-k96kg\" (UID: \"b7bf830c-e91a-4dd1-a3b5-64ca95d57e44\") " pod="openshift-ingress-canary/ingress-canary-k96kg" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.978835 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-qddjs" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.979098 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e56f29e6-292c-48d2-a5d7-531cfa6689f5-registration-dir\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.990321 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-bound-sa-token\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.990328 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st8mw\" (UniqueName: \"kubernetes.io/projected/d245b116-e47d-4b15-a40d-0a9fa34cf1df-kube-api-access-st8mw\") pod \"downloads-7954f5f757-x927s\" (UID: \"d245b116-e47d-4b15-a40d-0a9fa34cf1df\") " pod="openshift-console/downloads-7954f5f757-x927s" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.997525 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0c464093-f30e-4c79-84cd-98a6d49b813c-metrics-tls\") pod \"dns-default-6nhpx\" (UID: \"0c464093-f30e-4c79-84cd-98a6d49b813c\") " pod="openshift-dns/dns-default-6nhpx" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.002124 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdhqw\" (UniqueName: \"kubernetes.io/projected/2968d8f2-48fa-470b-b90e-41bb83bd77c4-kube-api-access-bdhqw\") pod \"machine-approver-56656f9798-cvsb9\" (UID: \"2968d8f2-48fa-470b-b90e-41bb83bd77c4\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.012164 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-8g9lj"] Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.018122 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tw8z\" (UniqueName: \"kubernetes.io/projected/f4f31f9d-d2da-4e3d-b002-2f477d344fc0-kube-api-access-7tw8z\") pod \"machine-config-controller-84d6567774-kt67m\" (UID: \"f4f31f9d-d2da-4e3d-b002-2f477d344fc0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.023248 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.031090 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36228b47-79c7-484d-9753-6f36806aa344-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-mn5w6\" (UID: \"36228b47-79c7-484d-9753-6f36806aa344\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.045884 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b6nrf" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.060560 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:39 crc kubenswrapper[4764]: E0309 13:24:39.061682 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:39.56166625 +0000 UTC m=+234.811838158 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.075870 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4js2s\" (UniqueName: \"kubernetes.io/projected/957cfae4-9b1a-4e7f-b526-60bc1ee1a1b4-kube-api-access-4js2s\") pod \"multus-admission-controller-857f4d67dd-hf98d\" (UID: \"957cfae4-9b1a-4e7f-b526-60bc1ee1a1b4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hf98d" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.087963 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftcvk\" (UniqueName: 
\"kubernetes.io/projected/39249ace-8681-491c-a547-869e72d297a7-kube-api-access-ftcvk\") pod \"ingress-operator-5b745b69d9-cd5zc\" (UID: \"39249ace-8681-491c-a547-869e72d297a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.089667 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt"] Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.108694 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsz44\" (UniqueName: \"kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-kube-api-access-xsz44\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.114172 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6" Mar 09 13:24:39 crc kubenswrapper[4764]: W0309 13:24:39.128922 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1b4dc0b_edea_4c0d_8d61_3e3d3133605d.slice/crio-74b142593d2c61fe165941847cdc507401c02d77a28ca82096827ce65a85b3e5 WatchSource:0}: Error finding container 74b142593d2c61fe165941847cdc507401c02d77a28ca82096827ce65a85b3e5: Status 404 returned error can't find the container with id 74b142593d2c61fe165941847cdc507401c02d77a28ca82096827ce65a85b3e5 Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.148343 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvm6v\" (UniqueName: \"kubernetes.io/projected/e56f29e6-292c-48d2-a5d7-531cfa6689f5-kube-api-access-qvm6v\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " 
pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.162601 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:39 crc kubenswrapper[4764]: E0309 13:24:39.162581 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:39.662556842 +0000 UTC m=+234.912728750 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.162872 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrqh5\" (UniqueName: \"kubernetes.io/projected/b7bf830c-e91a-4dd1-a3b5-64ca95d57e44-kube-api-access-mrqh5\") pod \"ingress-canary-k96kg\" (UID: \"b7bf830c-e91a-4dd1-a3b5-64ca95d57e44\") " pod="openshift-ingress-canary/ingress-canary-k96kg" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.163119 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:39 crc kubenswrapper[4764]: E0309 13:24:39.164336 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:39.66432523 +0000 UTC m=+234.914497148 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.165911 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.187396 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhg8d\" (UniqueName: \"kubernetes.io/projected/8e5162b2-c722-4535-9adf-3af0eee24211-kube-api-access-nhg8d\") pod \"machine-config-server-kdxg4\" (UID: \"8e5162b2-c722-4535-9adf-3af0eee24211\") " pod="openshift-machine-config-operator/machine-config-server-kdxg4" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.191346 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-kdxg4" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.215465 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-x927s" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.221558 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.228483 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxvw8\" (UniqueName: \"kubernetes.io/projected/0c464093-f30e-4c79-84cd-98a6d49b813c-kube-api-access-kxvw8\") pod \"dns-default-6nhpx\" (UID: \"0c464093-f30e-4c79-84cd-98a6d49b813c\") " pod="openshift-dns/dns-default-6nhpx" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.242203 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.265682 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:39 crc kubenswrapper[4764]: E0309 13:24:39.265816 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:39.765785118 +0000 UTC m=+235.015957026 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.266324 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:39 crc kubenswrapper[4764]: E0309 13:24:39.266954 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:39.766943059 +0000 UTC m=+235.017114967 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.282993 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hf98d" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.363524 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.367694 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:39 crc kubenswrapper[4764]: E0309 13:24:39.368203 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:39.868187911 +0000 UTC m=+235.118359819 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:39 crc kubenswrapper[4764]: W0309 13:24:39.373423 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb72bd4db_e5ea_44f6_bdce_81df2966acfb.slice/crio-c18c57b8ec43425645ca5f35f11b40303b55e73502767f0420d6ee129a178222 WatchSource:0}: Error finding container c18c57b8ec43425645ca5f35f11b40303b55e73502767f0420d6ee129a178222: Status 404 returned error can't find the container with id c18c57b8ec43425645ca5f35f11b40303b55e73502767f0420d6ee129a178222 Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.460071 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k96kg" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.477069 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:39 crc kubenswrapper[4764]: E0309 13:24:39.477448 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:39.977433508 +0000 UTC m=+235.227605416 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.482352 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6nhpx" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.498628 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551044-p748f"] Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.578194 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:39 crc kubenswrapper[4764]: E0309 13:24:39.579019 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:40.079000999 +0000 UTC m=+235.329172917 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.579235 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:39 crc kubenswrapper[4764]: E0309 13:24:39.579564 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:40.079544484 +0000 UTC m=+235.329716392 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.580739 4764 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-tgqwl container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.580792 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" podUID="b625331d-48ab-4d48-86fd-fe73466305ff" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.592982 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" event={"ID":"b625331d-48ab-4d48-86fd-fe73466305ff","Type":"ContainerStarted","Data":"878ba1fbacc5d195f2c3f5d1f7ed102a9da213e083eb5f15b669f308393abf9d"} Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.593204 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-nj856" event={"ID":"f9b3244b-8df0-4330-9887-4092260d416a","Type":"ContainerStarted","Data":"42c43e046cf7b3c37122d69fdadfdf37b294da8c4d73ad8e7c4ac09039e43f1c"} Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.593332 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5g2pp"] Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.593452 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.593602 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk"] Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.593773 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448"] Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.596638 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-sp2mq" event={"ID":"ee2ad8bf-7cf9-4bab-9638-b26d9c593188","Type":"ContainerStarted","Data":"a839636883aa4f201eff1e2771eff45834e9fd880c3edbdfd1093253a12cbf1a"} Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.608604 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l" event={"ID":"cf3d7b2a-75e7-4c07-9211-b66c64c15def","Type":"ContainerStarted","Data":"f44602f0fd06fc5b4a5856699a9ae61fb2b61106450e70eb3f6104cfbee6dfe5"} Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.608719 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l" event={"ID":"cf3d7b2a-75e7-4c07-9211-b66c64c15def","Type":"ContainerStarted","Data":"3a2f5d25288e8bc86cb3cee224bc80c295dc724258bc8041325b003c8ff187db"} Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.610278 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn"] Mar 09 13:24:39 crc kubenswrapper[4764]: 
I0309 13:24:39.613094 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" event={"ID":"61b85db0-a292-42a8-8296-d0e476d80c89","Type":"ContainerStarted","Data":"f998595181102b6c527dff000f09d875d8d715e5310d9ab7d8ba0e72dc742c89"} Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.615695 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" event={"ID":"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb","Type":"ContainerStarted","Data":"cfd2c52e9451edb42c1574c3bfe44c1c8e89acf647b496a5ee4bfe6db1b87112"} Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.621089 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" event={"ID":"23f87d2b-2a92-4abb-a2a6-2de508837343","Type":"ContainerStarted","Data":"06f569219e270618320b5e9344203a70dabe7cff34d14271dabece2d5bfeef8b"} Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.647058 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c2m6k" event={"ID":"baee6113-40a8-468e-b343-09e9afd65ce3","Type":"ContainerStarted","Data":"ef865bd11e36a99798afe2b99b9e97135ef2eab7e755d81a456c363efcf457a1"} Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.650513 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt" event={"ID":"b72bd4db-e5ea-44f6-bdce-81df2966acfb","Type":"ContainerStarted","Data":"c18c57b8ec43425645ca5f35f11b40303b55e73502767f0420d6ee129a178222"} Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.652417 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.654313 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" event={"ID":"ad49b4d8-1218-4a34-8455-831d0f563cbf","Type":"ContainerStarted","Data":"87b7e8df716d6db2234cb0d9f4b5435733eebf3453983d174cfd8ed4a517ff0c"} Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.654370 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" event={"ID":"ad49b4d8-1218-4a34-8455-831d0f563cbf","Type":"ContainerStarted","Data":"3ebce9784954b6d704d045dc1cbbc83b452607e4967b3e3637a00f82a3e69729"} Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.655250 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.662067 4764 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-mvq6r container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.662131 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" podUID="ad49b4d8-1218-4a34-8455-831d0f563cbf" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.662639 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9" event={"ID":"4125448d-5832-43c2-8dba-d95adde7458a","Type":"ContainerStarted","Data":"9dbae435a7eb2868f80bef0b5a8774f6a265623101848e72196b767d1eafe722"} Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.667256 
4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd" event={"ID":"4951d770-ae8c-470a-982a-807c82112722","Type":"ContainerStarted","Data":"71eee8dfc7e240ec93ee7fb35f30941ec5a5be0356b37b1332a7ecab80f5293c"} Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.676046 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc" event={"ID":"6dc446a1-b77b-4f15-ae5f-0141bf374cdd","Type":"ContainerStarted","Data":"263920c9620557294e3ee0141e1581d377009113573b651bea6f701776236ba7"} Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.677854 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-gnnbl" event={"ID":"b9c0d96b-ed96-4925-b890-8743879a8b38","Type":"ContainerStarted","Data":"e698665a8d11a1629142c5086f14482532d66300a5375d8c56a007f248de02c1"} Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.691271 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:39 crc kubenswrapper[4764]: E0309 13:24:39.695160 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:40.195124471 +0000 UTC m=+235.445296369 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.697949 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8g9lj" event={"ID":"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d","Type":"ContainerStarted","Data":"74b142593d2c61fe165941847cdc507401c02d77a28ca82096827ce65a85b3e5"} Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.793546 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:39 crc kubenswrapper[4764]: E0309 13:24:39.794907 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:40.294892484 +0000 UTC m=+235.545064462 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:39 crc kubenswrapper[4764]: W0309 13:24:39.871898 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode025e897_5ff3_476b_81c9_afdd0ae7a25f.slice/crio-096479a1922d69082b1ef9b9e58ac118c8a36513f53b16f4ee961276e252ac7b WatchSource:0}: Error finding container 096479a1922d69082b1ef9b9e58ac118c8a36513f53b16f4ee961276e252ac7b: Status 404 returned error can't find the container with id 096479a1922d69082b1ef9b9e58ac118c8a36513f53b16f4ee961276e252ac7b Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.897580 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:39 crc kubenswrapper[4764]: E0309 13:24:39.898406 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:40.398380986 +0000 UTC m=+235.648552894 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.999720 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:40 crc kubenswrapper[4764]: E0309 13:24:40.000090 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:40.50007578 +0000 UTC m=+235.750247708 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.101129 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:40 crc kubenswrapper[4764]: E0309 13:24:40.101848 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:40.601833766 +0000 UTC m=+235.852005674 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.202707 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:40 crc kubenswrapper[4764]: E0309 13:24:40.203093 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:40.703073838 +0000 UTC m=+235.953245746 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.309149 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:40 crc kubenswrapper[4764]: E0309 13:24:40.309520 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:40.80950095 +0000 UTC m=+236.059672878 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.434337 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:40 crc kubenswrapper[4764]: E0309 13:24:40.434863 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:40.93485121 +0000 UTC m=+236.185023118 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.485593 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-gst9d" podStartSLOduration=170.485573094 podStartE2EDuration="2m50.485573094s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:40.42703911 +0000 UTC m=+235.677211028" watchObservedRunningTime="2026-03-09 13:24:40.485573094 +0000 UTC m=+235.735745002" Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.536067 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:40 crc kubenswrapper[4764]: E0309 13:24:40.538119 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:41.038100186 +0000 UTC m=+236.288272094 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.639902 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:40 crc kubenswrapper[4764]: E0309 13:24:40.640210 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:41.140200191 +0000 UTC m=+236.390372099 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.749844 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:40 crc kubenswrapper[4764]: E0309 13:24:40.750566 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:41.250547598 +0000 UTC m=+236.500719516 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.767472 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-sp2mq" event={"ID":"ee2ad8bf-7cf9-4bab-9638-b26d9c593188","Type":"ContainerStarted","Data":"6406604b5450268574899e64582810b5f907ff3c88fc83afff57a81b94a53144"}
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.769534 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc" podStartSLOduration=170.769507208 podStartE2EDuration="2m50.769507208s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:40.755033129 +0000 UTC m=+236.005205037" watchObservedRunningTime="2026-03-09 13:24:40.769507208 +0000 UTC m=+236.019679106"
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.772458 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-sp2mq"
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.786533 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk" event={"ID":"4912a02a-743d-4bbe-9063-7d99ccd3329a","Type":"ContainerStarted","Data":"5ba2f46b011491b44b263563cb6458b4ff96a75dc2fc530aa089b2237d16c9dc"}
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.830805 4764 patch_prober.go:28] interesting pod/console-operator-58897d9998-sp2mq container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body=
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.830868 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-sp2mq" podUID="ee2ad8bf-7cf9-4bab-9638-b26d9c593188" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused"
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.831064 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448" event={"ID":"e025e897-5ff3-476b-81c9-afdd0ae7a25f","Type":"ContainerStarted","Data":"096479a1922d69082b1ef9b9e58ac118c8a36513f53b16f4ee961276e252ac7b"}
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.854389 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" podStartSLOduration=169.85437101 podStartE2EDuration="2m49.85437101s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:40.803926834 +0000 UTC m=+236.054098752" watchObservedRunningTime="2026-03-09 13:24:40.85437101 +0000 UTC m=+236.104542918"
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.855481 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd" podStartSLOduration=170.8554762 podStartE2EDuration="2m50.8554762s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:40.854199745 +0000 UTC m=+236.104371653" watchObservedRunningTime="2026-03-09 13:24:40.8554762 +0000 UTC m=+236.105648118"
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.872131 4764 generic.go:334] "Generic (PLEG): container finished" podID="f95ed010-a6a4-49ab-b61b-fc4ee2d856bb" containerID="58c3f7cb0255d51f3d1c36ba3c3a9cc17e6a287531939e7bdc0a0ed62bdddec0" exitCode=0
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.872456 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" event={"ID":"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb","Type":"ContainerDied","Data":"58c3f7cb0255d51f3d1c36ba3c3a9cc17e6a287531939e7bdc0a0ed62bdddec0"}
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.883475 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:40 crc kubenswrapper[4764]: E0309 13:24:40.891592 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:41.39156239 +0000 UTC m=+236.641734298 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.900802 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" podStartSLOduration=170.900781128 podStartE2EDuration="2m50.900781128s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:40.900273504 +0000 UTC m=+236.150445442" watchObservedRunningTime="2026-03-09 13:24:40.900781128 +0000 UTC m=+236.150953036"
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.913120 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9" event={"ID":"2968d8f2-48fa-470b-b90e-41bb83bd77c4","Type":"ContainerStarted","Data":"b271e38c47490532d225833222b6ca5e4a305ef1f97ca9ce9dcd15c0dd256421"}
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.964460 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5g2pp" event={"ID":"ad307984-e46b-466b-8a5c-63a00976fbbf","Type":"ContainerStarted","Data":"cc1cf6f09872939d79376154021ce0b2d674d8e6ef7e53ddd1bd64b7e2f6ebe6"}
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.965880 4764 generic.go:334] "Generic (PLEG): container finished" podID="61b85db0-a292-42a8-8296-d0e476d80c89" containerID="f192a03e8cad6434074c3bf8b0d1239551233c346c1a5a801901b4e502485f85" exitCode=0
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.965966 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" event={"ID":"61b85db0-a292-42a8-8296-d0e476d80c89","Type":"ContainerDied","Data":"f192a03e8cad6434074c3bf8b0d1239551233c346c1a5a801901b4e502485f85"}
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.967310 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l" podStartSLOduration=170.967283806 podStartE2EDuration="2m50.967283806s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:40.963473243 +0000 UTC m=+236.213645151" watchObservedRunningTime="2026-03-09 13:24:40.967283806 +0000 UTC m=+236.217455714"
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.989199 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:24:40 crc kubenswrapper[4764]: E0309 13:24:40.990403 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:41.490388707 +0000 UTC m=+236.740560615 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.004910 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-sp2mq" podStartSLOduration=171.004887507 podStartE2EDuration="2m51.004887507s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:41.003274413 +0000 UTC m=+236.253446321" watchObservedRunningTime="2026-03-09 13:24:41.004887507 +0000 UTC m=+236.255059415"
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.014611 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551044-p748f" event={"ID":"0a005f65-920a-4cdd-b4da-a270953113aa","Type":"ContainerStarted","Data":"0090a60e7aca5b3b5065eab933849dd34829a2b082555fb9f3ff31c4a933640e"}
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.038470 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-gnnbl" event={"ID":"b9c0d96b-ed96-4925-b890-8743879a8b38","Type":"ContainerStarted","Data":"dc0224d4c710f59d5ae4d9c8ebaf5e2202f8d17f96530cfbc95c1c94218d479e"}
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.061779 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-kdxg4" event={"ID":"8e5162b2-c722-4535-9adf-3af0eee24211","Type":"ContainerStarted","Data":"f37b9faf1705a2e566b1a64d30c1be15f3c6b2bdccb7212ed794a08b63dc64b0"}
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.061858 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-kdxg4" event={"ID":"8e5162b2-c722-4535-9adf-3af0eee24211","Type":"ContainerStarted","Data":"99260d16307bfad0173f86d572b75d711422195fab1bc9319bafc392c5caee3f"}
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.071044 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn" event={"ID":"39da5087-79bc-4154-b340-22183d9e4417","Type":"ContainerStarted","Data":"c05318f0e67358bb015e1f76742485abff9f53469c740dc411704a1af2febb07"}
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.084732 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-svr2n"]
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.092852 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448" podStartSLOduration=171.092827441 podStartE2EDuration="2m51.092827441s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:41.088768352 +0000 UTC m=+236.338940270" watchObservedRunningTime="2026-03-09 13:24:41.092827441 +0000 UTC m=+236.342999349"
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.094027 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:41 crc kubenswrapper[4764]: E0309 13:24:41.094443 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:41.594428654 +0000 UTC m=+236.844600562 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.111084 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" event={"ID":"23f87d2b-2a92-4abb-a2a6-2de508837343","Type":"ContainerStarted","Data":"9e34b8440c3bc84a76b19e43ca39644e8f0942f7bbfffcb2eb8834dd406ad88b"}
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.139108 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c2m6k" event={"ID":"baee6113-40a8-468e-b343-09e9afd65ce3","Type":"ContainerStarted","Data":"ab46b2a08c4693388a36bd0c17784e529e27cba686dcd4cbbefb51c733f08617"}
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.165493 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn" podStartSLOduration=171.165463274 podStartE2EDuration="2m51.165463274s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:41.144259504 +0000 UTC m=+236.394431422" watchObservedRunningTime="2026-03-09 13:24:41.165463274 +0000 UTC m=+236.415635182"
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.167043 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9k28f"]
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.195922 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:24:41 crc kubenswrapper[4764]: E0309 13:24:41.199061 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:41.699031087 +0000 UTC m=+236.949202995 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.221289 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-gst9d"
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.221763 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-gst9d"
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.229801 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-gnnbl" podStartSLOduration=171.229779093 podStartE2EDuration="2m51.229779093s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:41.217721369 +0000 UTC m=+236.467893277" watchObservedRunningTime="2026-03-09 13:24:41.229779093 +0000 UTC m=+236.479951011"
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.242848 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8g9lj" event={"ID":"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d","Type":"ContainerStarted","Data":"472485e262204415197653c119165e15c1dde77e4c46e1bf7f2b023ba959af99"}
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.249419 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd"]
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.266196 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-nj856" event={"ID":"f9b3244b-8df0-4330-9887-4092260d416a","Type":"ContainerStarted","Data":"5e8fcdb2f8638bd16f9f163bfced3472de42f2c6547d60287a9a8804c7cc74e3"}
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.267415 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-nj856"
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.274452 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8"]
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.277143 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn"]
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.279044 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-kdxg4" podStartSLOduration=5.279018277 podStartE2EDuration="5.279018277s" podCreationTimestamp="2026-03-09 13:24:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:41.268674699 +0000 UTC m=+236.518846607" watchObservedRunningTime="2026-03-09 13:24:41.279018277 +0000 UTC m=+236.529190175"
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.282615 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf"]
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.284713 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-d4gwh"]
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.297405 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9" event={"ID":"4125448d-5832-43c2-8dba-d95adde7458a","Type":"ContainerStarted","Data":"974a6a0feae12692003e99871ae67b10d7691df07927796cce4716a1be304c39"}
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.297533 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9" event={"ID":"4125448d-5832-43c2-8dba-d95adde7458a","Type":"ContainerStarted","Data":"b690fd6d76bc82d1b0b1264df224431c765be00012875b5c40608249817b21f6"}
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.304055 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:41 crc kubenswrapper[4764]: E0309 13:24:41.304477 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:41.804460631 +0000 UTC m=+237.054632539 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.316209 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl"
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.316288 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r"
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.319614 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss"]
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.356635 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-c2m6k" podStartSLOduration=171.356610893 podStartE2EDuration="2m51.356610893s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:41.33677884 +0000 UTC m=+236.586950758" watchObservedRunningTime="2026-03-09 13:24:41.356610893 +0000 UTC m=+236.606782811"
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.388685 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-8g9lj" podStartSLOduration=171.388671455 podStartE2EDuration="2m51.388671455s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:41.38809454 +0000 UTC m=+236.638266438" watchObservedRunningTime="2026-03-09 13:24:41.388671455 +0000 UTC m=+236.638843363"
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.410870 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:24:41 crc kubenswrapper[4764]: E0309 13:24:41.412415 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:41.912399973 +0000 UTC m=+237.162571881 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.513254 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:41 crc kubenswrapper[4764]: E0309 13:24:41.513907 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:42.013894162 +0000 UTC m=+237.264066070 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.518245 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" podStartSLOduration=171.518220379 podStartE2EDuration="2m51.518220379s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:41.454559717 +0000 UTC m=+236.704731625" watchObservedRunningTime="2026-03-09 13:24:41.518220379 +0000 UTC m=+236.768392287"
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.519449 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-nj856" podStartSLOduration=171.519441941 podStartE2EDuration="2m51.519441941s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:41.517370486 +0000 UTC m=+236.767542414" watchObservedRunningTime="2026-03-09 13:24:41.519441941 +0000 UTC m=+236.769613849"
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.553940 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-b6nrf"]
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.561883 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9" podStartSLOduration=170.561839701 podStartE2EDuration="2m50.561839701s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:41.561841501 +0000 UTC m=+236.812013409" watchObservedRunningTime="2026-03-09 13:24:41.561839701 +0000 UTC m=+236.812011609"
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.586809 4764 patch_prober.go:28] interesting pod/apiserver-76f77b778f-gst9d container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 09 13:24:41 crc kubenswrapper[4764]: [+]log ok
Mar 09 13:24:41 crc kubenswrapper[4764]: [+]etcd ok
Mar 09 13:24:41 crc kubenswrapper[4764]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 09 13:24:41 crc kubenswrapper[4764]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 09 13:24:41 crc kubenswrapper[4764]: [+]poststarthook/max-in-flight-filter ok
Mar 09 13:24:41 crc kubenswrapper[4764]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 09 13:24:41 crc kubenswrapper[4764]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Mar 09 13:24:41 crc kubenswrapper[4764]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Mar 09 13:24:41 crc kubenswrapper[4764]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Mar 09 13:24:41 crc kubenswrapper[4764]: [+]poststarthook/project.openshift.io-projectcache ok
Mar 09 13:24:41 crc kubenswrapper[4764]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Mar 09 13:24:41 crc kubenswrapper[4764]: [+]poststarthook/openshift.io-startinformers ok
Mar 09 13:24:41 crc kubenswrapper[4764]: [+]poststarthook/openshift.io-restmapperupdater ok
Mar 09 13:24:41 crc kubenswrapper[4764]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Mar 09 13:24:41 crc kubenswrapper[4764]: livez check failed
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.586877 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-gst9d" podUID="1ffb8d96-e6e4-4859-ae7d-37f900979485" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.619167 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:24:41 crc kubenswrapper[4764]: E0309 13:24:41.619512 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:42.119497142 +0000 UTC m=+237.369669040 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.720832 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:41 crc kubenswrapper[4764]: E0309 13:24:41.721251 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:42.221235697 +0000 UTC m=+237.471407605 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.730577 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hf98d"]
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.731041 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc"]
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.731213 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2bqx9"]
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.731299 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6"]
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.737850 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6"]
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.740824 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6nhpx"]
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.745399 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-qddjs"]
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.752687 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m"]
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.785430 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-x927s"]
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.796046 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-gnnbl"
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.803061 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.803235 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.821815 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:24:41 crc kubenswrapper[4764]: E0309 13:24:41.822178 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:42.32215105 +0000 UTC m=+237.572322958 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.822270 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:41 crc kubenswrapper[4764]: E0309 13:24:41.824622 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:42.324607616 +0000 UTC m=+237.574779524 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.892769 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-k96kg"] Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.923208 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:41 crc kubenswrapper[4764]: E0309 13:24:41.923548 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:42.423533836 +0000 UTC m=+237.673705744 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:41 crc kubenswrapper[4764]: W0309 13:24:41.997101 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7bf830c_e91a_4dd1_a3b5_64ca95d57e44.slice/crio-51e42dcb16123e5cf2fd2b0d634046ecad5b3c0788df6e54acad60f0073d6310 WatchSource:0}: Error finding container 51e42dcb16123e5cf2fd2b0d634046ecad5b3c0788df6e54acad60f0073d6310: Status 404 returned error can't find the container with id 51e42dcb16123e5cf2fd2b0d634046ecad5b3c0788df6e54acad60f0073d6310 Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.043466 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:42 crc kubenswrapper[4764]: E0309 13:24:42.043765 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:42.543754198 +0000 UTC m=+237.793926106 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.144266 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:42 crc kubenswrapper[4764]: E0309 13:24:42.144461 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:42.644428975 +0000 UTC m=+237.894600893 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.144579 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:42 crc kubenswrapper[4764]: E0309 13:24:42.144949 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:42.644933859 +0000 UTC m=+237.895105767 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.246742 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:42 crc kubenswrapper[4764]: E0309 13:24:42.246841 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:42.746822257 +0000 UTC m=+237.996994165 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.247274 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:42 crc kubenswrapper[4764]: E0309 13:24:42.247609 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:42.747602048 +0000 UTC m=+237.997773956 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.257323 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tgqwl"] Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.265013 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.327486 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9" event={"ID":"2968d8f2-48fa-470b-b90e-41bb83bd77c4","Type":"ContainerStarted","Data":"3c841173303d5e946e28133a9f7209f8258a96e0d394cc6cb05219cd1ad5cd80"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.327544 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9" event={"ID":"2968d8f2-48fa-470b-b90e-41bb83bd77c4","Type":"ContainerStarted","Data":"e882a241a6b64098b307bd65302f593b109c7616d0dfcd5af3432be62bac1490"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.329181 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9k28f" event={"ID":"4ba55602-0e3f-4722-b437-546732351bc4","Type":"ContainerStarted","Data":"ce0e28aa70d3b1872a1159ed0a983773af51f49c62d7d29ccea3a54c85809714"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.329201 4764 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9k28f" event={"ID":"4ba55602-0e3f-4722-b437-546732351bc4","Type":"ContainerStarted","Data":"ce323e500dd6968c2c9c8d88fa53de8fd906e1738bf2fd3a70a4cc1dbd027d43"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.330462 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf" event={"ID":"711447e6-e7cf-4577-8050-b5a391f96f6a","Type":"ContainerStarted","Data":"d1aeac6ab405adbedcfbf26d14673b870c7935fc9017a2f997ff017a10884246"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.332230 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6" event={"ID":"36228b47-79c7-484d-9753-6f36806aa344","Type":"ContainerStarted","Data":"2a85ff48150941711bc5dc6f6e1946e85fc51e5e1ee8355d96efd19f94408a1b"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.347814 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:42 crc kubenswrapper[4764]: E0309 13:24:42.348293 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:42.848278355 +0000 UTC m=+238.098450263 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.356824 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448" event={"ID":"e025e897-5ff3-476b-81c9-afdd0ae7a25f","Type":"ContainerStarted","Data":"683a86869b3cf186005a2d8d66a9152062ee1aa672b9fadbcc0e02bd95576ecb"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.421483 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc" event={"ID":"39249ace-8681-491c-a547-869e72d297a7","Type":"ContainerStarted","Data":"e130962f3c9e2e9cc9876329a3337db4ebea068c866a0f3f7b18814bee37cbac"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.434862 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss" event={"ID":"d0959a00-2a83-457f-bcba-7d4af48b11c3","Type":"ContainerStarted","Data":"46c216dd80fead93f3b79f0eb2a4d786abd149af408a05f24bc985c61274f03d"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.439546 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" event={"ID":"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb","Type":"ContainerStarted","Data":"4f9046706a3f827f5fd453c79a14e6921bf3de7ff8fea0a8fcd6339057fbf22a"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.447960 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn" event={"ID":"39da5087-79bc-4154-b340-22183d9e4417","Type":"ContainerStarted","Data":"45383c97959a17ffdeaa9f0ab6e8a1110b113c90d84de0fa490663169c04fa26"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.449023 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:42 crc kubenswrapper[4764]: E0309 13:24:42.450960 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:42.950948325 +0000 UTC m=+238.201120233 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.493933 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-k96kg" event={"ID":"b7bf830c-e91a-4dd1-a3b5-64ca95d57e44","Type":"ContainerStarted","Data":"51e42dcb16123e5cf2fd2b0d634046ecad5b3c0788df6e54acad60f0073d6310"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.494719 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r"] Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.520604 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9" podStartSLOduration=172.520582147 podStartE2EDuration="2m52.520582147s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:42.494838185 +0000 UTC m=+237.745010093" watchObservedRunningTime="2026-03-09 13:24:42.520582147 +0000 UTC m=+237.770754055" Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.530951 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" event={"ID":"1ccc5b44-95ad-4f4c-8086-c176c41bbd19","Type":"ContainerStarted","Data":"771cd63965fde5f5f03cba604e9f4e1989cf6a4881a27fbd710be5727898d90a"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.545827 4764 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9k28f" podStartSLOduration=171.545798206 podStartE2EDuration="2m51.545798206s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:42.542353013 +0000 UTC m=+237.792524921" watchObservedRunningTime="2026-03-09 13:24:42.545798206 +0000 UTC m=+237.795970114" Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.546279 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn" event={"ID":"1fe7eff7-ce11-4d46-bf25-06162522c1ff","Type":"ContainerStarted","Data":"8e8672c3608b3ac01b268f2222a4a8cd4ad621c1058e37144ca6a9c8fec3428d"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.546339 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn" event={"ID":"1fe7eff7-ce11-4d46-bf25-06162522c1ff","Type":"ContainerStarted","Data":"b76cc8470833290a8cdc0af534890facef6330d24592cf5cab7eff90a97b01b4"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.568803 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:42 crc kubenswrapper[4764]: E0309 13:24:42.569901 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-09 13:24:43.069885673 +0000 UTC m=+238.320057581 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.586726 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk" event={"ID":"4912a02a-743d-4bbe-9063-7d99ccd3329a","Type":"ContainerStarted","Data":"bd09da2955adc131fc7c3ddd3df7a6b960386a79e7cc0b7bb14222b804484b3c"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.586770 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk" event={"ID":"4912a02a-743d-4bbe-9063-7d99ccd3329a","Type":"ContainerStarted","Data":"9ba7188c7b70176f3445c4dce5f61edb07ec68251679c54544de9639e45e6110"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.597225 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd" event={"ID":"30a07c97-9d99-41be-956e-ba3d6505d318","Type":"ContainerStarted","Data":"52ff6a9b11622efde32cfd2c095ae29f67209b68a6062d6f9f3d39a5e5954bcb"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.612332 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" event={"ID":"14b4aa8c-1066-4388-9442-07722e4c76c2","Type":"ContainerStarted","Data":"6e0609643514d25b899aa0fec5020ebe1e5f97db34a9c6358382637eb561b8bf"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 
13:24:42.613623 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6nhpx" event={"ID":"0c464093-f30e-4c79-84cd-98a6d49b813c","Type":"ContainerStarted","Data":"6fda0449cd3de2ac61b5306b39c5053b3087e83da3f97ebe92a6989af5624216"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.629068 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" podStartSLOduration=171.629050654 podStartE2EDuration="2m51.629050654s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:42.626733391 +0000 UTC m=+237.876905309" watchObservedRunningTime="2026-03-09 13:24:42.629050654 +0000 UTC m=+237.879222562" Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.630785 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m" event={"ID":"f4f31f9d-d2da-4e3d-b002-2f477d344fc0","Type":"ContainerStarted","Data":"018ed989a89604caa549e9933a3e4afacd7cdbe9d8fb62d4a234b3d49348aead"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.666770 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" event={"ID":"61b85db0-a292-42a8-8296-d0e476d80c89","Type":"ContainerStarted","Data":"dc18497145249da5d49ff9ecb83795ba7f30271084e17b522ebdc49b11a91a79"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.667401 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.670619 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:42 crc kubenswrapper[4764]: E0309 13:24:42.688263 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:43.188233235 +0000 UTC m=+238.438405143 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.688432 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8" event={"ID":"db0ec273-54d8-4753-b519-243b727a9efd","Type":"ContainerStarted","Data":"85384ed843ba7525776bc0e37762a1506fee5aed8c8b4de5a63a312c01aa5397"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.708554 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b6nrf" event={"ID":"2c795327-bff9-453d-b1b2-d48a2ef2b48f","Type":"ContainerStarted","Data":"0e8d6fc6921c311c06fdbd8c2b577d743a2f36ee5852e92052d728ca4e7e7b9e"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.709577 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" 
event={"ID":"e56f29e6-292c-48d2-a5d7-531cfa6689f5","Type":"ContainerStarted","Data":"4b71e28be35b18ba52af2a6b9e7dfe5e124ed335ec95f1aabd4ac0cb31e12461"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.707867 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk" podStartSLOduration=171.707843802 podStartE2EDuration="2m51.707843802s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:42.706778294 +0000 UTC m=+237.956950202" watchObservedRunningTime="2026-03-09 13:24:42.707843802 +0000 UTC m=+237.958015710" Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.712934 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn" podStartSLOduration=171.712914439 podStartE2EDuration="2m51.712914439s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:42.666280325 +0000 UTC m=+237.916452233" watchObservedRunningTime="2026-03-09 13:24:42.712914439 +0000 UTC m=+237.963086357" Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.755902 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" podStartSLOduration=172.755881114 podStartE2EDuration="2m52.755881114s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:42.755248807 +0000 UTC m=+238.005420735" watchObservedRunningTime="2026-03-09 13:24:42.755881114 +0000 UTC 
m=+238.006053022" Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.769336 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hf98d" event={"ID":"957cfae4-9b1a-4e7f-b526-60bc1ee1a1b4","Type":"ContainerStarted","Data":"fb969b9bf90fa52524547e416ef64f049b665c02a0cee1fda383ad6afb853030"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.788472 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:42 crc kubenswrapper[4764]: E0309 13:24:42.789316 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:43.289263301 +0000 UTC m=+238.539435209 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.802939 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:42 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 09 13:24:42 crc kubenswrapper[4764]: [+]process-running ok Mar 09 13:24:42 crc kubenswrapper[4764]: healthz check failed Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.803068 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.823993 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-x927s" event={"ID":"d245b116-e47d-4b15-a40d-0a9fa34cf1df","Type":"ContainerStarted","Data":"fb8983fcce0c65bb236e4bf6391a7232453b84690f04abd146b7159e644824b5"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.837131 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-qddjs" event={"ID":"24df0ad8-c9b4-46b6-8751-23e26fc391c5","Type":"ContainerStarted","Data":"aa04e64eb67113f09ca7b1961f1e7e1abe40f9ad1e80aa1a5c846db9de1e02ba"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.847683 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-svr2n" event={"ID":"8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497","Type":"ContainerStarted","Data":"bc0446f3c4009aefe59e54fcf22af3e8b6492a5f7a324e98c3ceaef95c9107b9"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.847998 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-svr2n" event={"ID":"8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497","Type":"ContainerStarted","Data":"acc1adb0f9a7d2151a697693548447578320be7a5c1a7959581039acf6e2722f"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.856466 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5g2pp" event={"ID":"ad307984-e46b-466b-8a5c-63a00976fbbf","Type":"ContainerStarted","Data":"6ec0ff7f4f3afbfa3ba79233bf901cbdeab145e0b36f5500af8fdfe0fae981f4"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.856509 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5g2pp" event={"ID":"ad307984-e46b-466b-8a5c-63a00976fbbf","Type":"ContainerStarted","Data":"a77f85524f39dfdce8514bbbba00b0efcc11abc1b64d546d6fe4425c458d4b10"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.866185 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt" event={"ID":"b72bd4db-e5ea-44f6-bdce-81df2966acfb","Type":"ContainerStarted","Data":"b9c2d68b1b05bc39c19dc9c0d1b5c51309a4b0fc69bc58ae3f2b3b3980758063"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.868859 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt" Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.893056 4764 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-svr2n" podStartSLOduration=171.893037092 podStartE2EDuration="2m51.893037092s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:42.883564507 +0000 UTC m=+238.133736425" watchObservedRunningTime="2026-03-09 13:24:42.893037092 +0000 UTC m=+238.143209000" Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.893441 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:42 crc kubenswrapper[4764]: E0309 13:24:42.907667 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:43.407651525 +0000 UTC m=+238.657823433 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.943086 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-sp2mq" Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.960194 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt" Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.997755 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt" podStartSLOduration=171.997738907 podStartE2EDuration="2m51.997738907s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:42.997043388 +0000 UTC m=+238.247215296" watchObservedRunningTime="2026-03-09 13:24:42.997738907 +0000 UTC m=+238.247910815" Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.999937 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:43 crc kubenswrapper[4764]: E0309 13:24:43.000111 4764 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:43.50008812 +0000 UTC m=+238.750260038 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.000734 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:43 crc kubenswrapper[4764]: E0309 13:24:43.001067 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:43.501054216 +0000 UTC m=+238.751226124 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.105801 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:43 crc kubenswrapper[4764]: E0309 13:24:43.106010 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:43.605980607 +0000 UTC m=+238.856152515 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.106473 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:43 crc kubenswrapper[4764]: E0309 13:24:43.106828 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:43.606815959 +0000 UTC m=+238.856987857 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.124240 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5g2pp" podStartSLOduration=173.124223257 podStartE2EDuration="2m53.124223257s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:43.032909242 +0000 UTC m=+238.283081150" watchObservedRunningTime="2026-03-09 13:24:43.124223257 +0000 UTC m=+238.374395165" Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.208485 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:43 crc kubenswrapper[4764]: E0309 13:24:43.208955 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:43.708941455 +0000 UTC m=+238.959113363 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.242503 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.242928 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.287314 4764 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-mp9p7 container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.32:8443/livez\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.287391 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" podUID="f95ed010-a6a4-49ab-b61b-fc4ee2d856bb" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.32:8443/livez\": dial tcp 10.217.0.32:8443: connect: connection refused" Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.309842 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:43 crc kubenswrapper[4764]: E0309 13:24:43.310195 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:43.810181247 +0000 UTC m=+239.060353155 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.411637 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:43 crc kubenswrapper[4764]: E0309 13:24:43.412094 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:43.912077747 +0000 UTC m=+239.162249655 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.416891 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:43 crc kubenswrapper[4764]: E0309 13:24:43.417433 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:43.91741677 +0000 UTC m=+239.167588678 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.518256 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:43 crc kubenswrapper[4764]: E0309 13:24:43.518695 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:44.018660843 +0000 UTC m=+239.268832751 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.623480 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:43 crc kubenswrapper[4764]: E0309 13:24:43.624203 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:44.12418878 +0000 UTC m=+239.374360688 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.725005 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:43 crc kubenswrapper[4764]: E0309 13:24:43.725465 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:44.225448593 +0000 UTC m=+239.475620501 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.814405 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:43 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 09 13:24:43 crc kubenswrapper[4764]: [+]process-running ok Mar 09 13:24:43 crc kubenswrapper[4764]: healthz check failed Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.814448 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.827317 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:43 crc kubenswrapper[4764]: E0309 13:24:43.827652 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-09 13:24:44.32763511 +0000 UTC m=+239.577807018 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.898064 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-k96kg" event={"ID":"b7bf830c-e91a-4dd1-a3b5-64ca95d57e44","Type":"ContainerStarted","Data":"5a4c9d2a72e910ac851b9d01334a9a52e320d29fcd19cbd0b7048b2d972bdb8b"} Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.914147 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc" event={"ID":"39249ace-8681-491c-a547-869e72d297a7","Type":"ContainerStarted","Data":"9dbfdb319cfbda4d2363de45293073dc4f68963073184760bddb42b83b7fd959"} Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.928461 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:43 crc kubenswrapper[4764]: E0309 13:24:43.929140 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-09 13:24:44.429109028 +0000 UTC m=+239.679280936 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.949683 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" event={"ID":"14b4aa8c-1066-4388-9442-07722e4c76c2","Type":"ContainerStarted","Data":"239a4bfbac809dccd0d77ee5ebd983674866554d54f688f9884e5532b5af4bf3"} Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.950926 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.952290 4764 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-m4bs6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:5443/healthz\": dial tcp 10.217.0.27:5443: connect: connection refused" start-of-body= Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.952332 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" podUID="14b4aa8c-1066-4388-9442-07722e4c76c2" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.27:5443/healthz\": dial tcp 10.217.0.27:5443: connect: connection refused" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.002047 4764 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" podStartSLOduration=173.002030439 podStartE2EDuration="2m53.002030439s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:44.000685593 +0000 UTC m=+239.250857511" watchObservedRunningTime="2026-03-09 13:24:44.002030439 +0000 UTC m=+239.252202347" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.003189 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-k96kg" podStartSLOduration=9.00318289 podStartE2EDuration="9.00318289s" podCreationTimestamp="2026-03-09 13:24:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:43.945750686 +0000 UTC m=+239.195922594" watchObservedRunningTime="2026-03-09 13:24:44.00318289 +0000 UTC m=+239.253354798" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.008911 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b6nrf" event={"ID":"2c795327-bff9-453d-b1b2-d48a2ef2b48f","Type":"ContainerStarted","Data":"0092591463bed38f0d9307f831756d3403c2c467907ed9ed4adf7df61b3e8e83"} Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.037991 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:44 crc kubenswrapper[4764]: E0309 13:24:44.039764 4764 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:44.539747823 +0000 UTC m=+239.789919731 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.042336 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hf98d" event={"ID":"957cfae4-9b1a-4e7f-b526-60bc1ee1a1b4","Type":"ContainerStarted","Data":"0247f0ecbef2c5254057c30c71649d8fc73c61d8130087fef46f695974b0c5d0"} Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.083310 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd" event={"ID":"30a07c97-9d99-41be-956e-ba3d6505d318","Type":"ContainerStarted","Data":"9ab3fcffeccd0977d687bad1b407d063e8a5dbd95b934ebdbc0879da24a9d363"} Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.084388 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.091565 4764 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-ptbpd container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 
13:24:44.091614 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd" podUID="30a07c97-9d99-41be-956e-ba3d6505d318" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.129750 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" event={"ID":"1ccc5b44-95ad-4f4c-8086-c176c41bbd19","Type":"ContainerStarted","Data":"2b23f753224c479f8f3f33754ceac00343e6409f6988048cb245d41d456b1666"} Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.129810 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.141196 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:44 crc kubenswrapper[4764]: E0309 13:24:44.141871 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:44.641855198 +0000 UTC m=+239.892027106 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.142661 4764 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-d4gwh container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.142718 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" podUID="1ccc5b44-95ad-4f4c-8086-c176c41bbd19" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.147294 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-qddjs" event={"ID":"24df0ad8-c9b4-46b6-8751-23e26fc391c5","Type":"ContainerStarted","Data":"c0a5e0c4e983568b143fbc0bc2bb99af436b4e6317ddf3b8ba9ae7a67dfad358"} Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.150395 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6" podStartSLOduration=173.150382698 podStartE2EDuration="2m53.150382698s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-09 13:24:44.038487849 +0000 UTC m=+239.288659757" watchObservedRunningTime="2026-03-09 13:24:44.150382698 +0000 UTC m=+239.400554606" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.151844 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd" podStartSLOduration=173.151837497 podStartE2EDuration="2m53.151837497s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:44.146654637 +0000 UTC m=+239.396826555" watchObservedRunningTime="2026-03-09 13:24:44.151837497 +0000 UTC m=+239.402009405" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.194316 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf" event={"ID":"711447e6-e7cf-4577-8050-b5a391f96f6a","Type":"ContainerStarted","Data":"cb07cbdd389629d8ee18f07cc12b6c2c3505d455786f0a1b1e67b9ed8a5d4b78"} Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.220404 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-qddjs" podStartSLOduration=173.22038963 podStartE2EDuration="2m53.22038963s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:44.218686534 +0000 UTC m=+239.468858442" watchObservedRunningTime="2026-03-09 13:24:44.22038963 +0000 UTC m=+239.470561538" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.220834 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" podStartSLOduration=173.220826212 podStartE2EDuration="2m53.220826212s" 
podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:44.178091893 +0000 UTC m=+239.428263811" watchObservedRunningTime="2026-03-09 13:24:44.220826212 +0000 UTC m=+239.470998110" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.243215 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:44 crc kubenswrapper[4764]: E0309 13:24:44.244650 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:44.744631922 +0000 UTC m=+239.994803830 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.260078 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf" podStartSLOduration=173.260061117 podStartE2EDuration="2m53.260061117s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:44.257516828 +0000 UTC m=+239.507688736" watchObservedRunningTime="2026-03-09 13:24:44.260061117 +0000 UTC m=+239.510233025" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.286457 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-x927s" event={"ID":"d245b116-e47d-4b15-a40d-0a9fa34cf1df","Type":"ContainerStarted","Data":"4ed619baf68f65746ed98427321c72cd4adc6131be0c5ab50bca4ff635aa848c"} Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.286528 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-x927s" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.311403 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss" event={"ID":"d0959a00-2a83-457f-bcba-7d4af48b11c3","Type":"ContainerStarted","Data":"650632f786d1cc04ce84b188d86dd1f51067920d278adb844500f8bb3c442e34"} Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.320626 4764 
patch_prober.go:28] interesting pod/downloads-7954f5f757-x927s container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.320710 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x927s" podUID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.323872 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-x927s" podStartSLOduration=174.323853342 podStartE2EDuration="2m54.323853342s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:44.312010103 +0000 UTC m=+239.562182011" watchObservedRunningTime="2026-03-09 13:24:44.323853342 +0000 UTC m=+239.574025260" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.344727 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:44 crc kubenswrapper[4764]: E0309 13:24:44.345441 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-09 13:24:44.845387111 +0000 UTC m=+240.095559019 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.347333 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m" event={"ID":"f4f31f9d-d2da-4e3d-b002-2f477d344fc0","Type":"ContainerStarted","Data":"4b48ee04fb4d9cd72ec995aff59d66de6aa64b9977090159021e71e6e46f1c0a"} Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.347374 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m" event={"ID":"f4f31f9d-d2da-4e3d-b002-2f477d344fc0","Type":"ContainerStarted","Data":"228541715f137f4256dfda489265aa94898f79ab8cc9bbc96f152ad8f9b36b00"} Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.370471 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8" event={"ID":"db0ec273-54d8-4753-b519-243b727a9efd","Type":"ContainerStarted","Data":"297d9673d6c5a5a84640d3d03df6d12bc6a8bee1a2f228d1b78cc69911a31952"} Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.370526 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8" event={"ID":"db0ec273-54d8-4753-b519-243b727a9efd","Type":"ContainerStarted","Data":"103dbe40f99ec961f84f53a9e285ee4190ff7195b2a42f0029837297da92234c"} Mar 09 13:24:44 crc 
kubenswrapper[4764]: I0309 13:24:44.370542 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.372120 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" podUID="b625331d-48ab-4d48-86fd-fe73466305ff" containerName="controller-manager" containerID="cri-o://878ba1fbacc5d195f2c3f5d1f7ed102a9da213e083eb5f15b669f308393abf9d" gracePeriod=30 Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.373673 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" podUID="ad49b4d8-1218-4a34-8455-831d0f563cbf" containerName="route-controller-manager" containerID="cri-o://87b7e8df716d6db2234cb0d9f4b5435733eebf3453983d174cfd8ed4a517ff0c" gracePeriod=30 Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.386765 4764 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-fxv7j container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.386817 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" podUID="61b85db0-a292-42a8-8296-d0e476d80c89" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.396180 4764 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-fxv7j container/openshift-config-operator namespace/openshift-config-operator: 
Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.396224 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" podUID="61b85db0-a292-42a8-8296-d0e476d80c89" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.396656 4764 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-fxv7j container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.396688 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" podUID="61b85db0-a292-42a8-8296-d0e476d80c89" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.410976 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss" podStartSLOduration=173.410959464 podStartE2EDuration="2m53.410959464s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:44.371817991 +0000 UTC m=+239.621989899" watchObservedRunningTime="2026-03-09 13:24:44.410959464 +0000 UTC m=+239.661131372" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 
13:24:44.411071 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m" podStartSLOduration=173.411066727 podStartE2EDuration="2m53.411066727s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:44.410540832 +0000 UTC m=+239.660712740" watchObservedRunningTime="2026-03-09 13:24:44.411066727 +0000 UTC m=+239.661238645" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.447903 4764 ???:1] "http: TLS handshake error from 192.168.126.11:48390: no serving certificate available for the kubelet" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.457215 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:44 crc kubenswrapper[4764]: E0309 13:24:44.458339 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:44.958325287 +0000 UTC m=+240.208497285 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.493267 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8" podStartSLOduration=173.493253516 podStartE2EDuration="2m53.493253516s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:44.486737101 +0000 UTC m=+239.736909009" watchObservedRunningTime="2026-03-09 13:24:44.493253516 +0000 UTC m=+239.743425424" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.507293 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nrc8s"] Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.514152 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nrc8s" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.517020 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.517633 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nrc8s"] Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.536990 4764 ???:1] "http: TLS handshake error from 192.168.126.11:48400: no serving certificate available for the kubelet" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.563960 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.564107 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-utilities\") pod \"community-operators-nrc8s\" (UID: \"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97\") " pod="openshift-marketplace/community-operators-nrc8s" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.564203 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff292\" (UniqueName: \"kubernetes.io/projected/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-kube-api-access-ff292\") pod \"community-operators-nrc8s\" (UID: \"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97\") " pod="openshift-marketplace/community-operators-nrc8s" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.564221 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-catalog-content\") pod \"community-operators-nrc8s\" (UID: \"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97\") " pod="openshift-marketplace/community-operators-nrc8s" Mar 09 13:24:44 crc kubenswrapper[4764]: E0309 13:24:44.564309 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:45.064295046 +0000 UTC m=+240.314466954 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.622982 4764 ???:1] "http: TLS handshake error from 192.168.126.11:48414: no serving certificate available for the kubelet" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.665630 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-utilities\") pod \"community-operators-nrc8s\" (UID: \"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97\") " pod="openshift-marketplace/community-operators-nrc8s" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.666024 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.666119 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff292\" (UniqueName: \"kubernetes.io/projected/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-kube-api-access-ff292\") pod \"community-operators-nrc8s\" (UID: \"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97\") " pod="openshift-marketplace/community-operators-nrc8s" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.666147 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-catalog-content\") pod \"community-operators-nrc8s\" (UID: \"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97\") " pod="openshift-marketplace/community-operators-nrc8s" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.666942 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-catalog-content\") pod \"community-operators-nrc8s\" (UID: \"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97\") " pod="openshift-marketplace/community-operators-nrc8s" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.667167 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-utilities\") pod \"community-operators-nrc8s\" (UID: \"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97\") " pod="openshift-marketplace/community-operators-nrc8s" Mar 09 13:24:44 crc kubenswrapper[4764]: E0309 13:24:44.667451 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-09 13:24:45.16743924 +0000 UTC m=+240.417611148 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.700551 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8d627"] Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.702127 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.705345 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff292\" (UniqueName: \"kubernetes.io/projected/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-kube-api-access-ff292\") pod \"community-operators-nrc8s\" (UID: \"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97\") " pod="openshift-marketplace/community-operators-nrc8s" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.709711 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.712937 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8d627"] Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.725002 4764 ???:1] "http: TLS handshake error from 192.168.126.11:48430: no serving certificate available for the kubelet" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.769139 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.769421 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-catalog-content\") pod \"certified-operators-8d627\" (UID: \"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b\") " pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.769530 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fsvt\" (UniqueName: \"kubernetes.io/projected/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-kube-api-access-5fsvt\") pod \"certified-operators-8d627\" (UID: \"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b\") " pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.769560 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-utilities\") pod \"certified-operators-8d627\" (UID: \"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b\") " pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:24:44 crc kubenswrapper[4764]: E0309 13:24:44.769696 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:45.269661828 +0000 UTC m=+240.519833736 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.793906 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:44 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 09 13:24:44 crc kubenswrapper[4764]: [+]process-running ok Mar 09 13:24:44 crc kubenswrapper[4764]: healthz check failed Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.793956 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.817958 4764 ???:1] "http: TLS handshake error from 192.168.126.11:48446: no serving certificate available for the kubelet" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.870508 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-catalog-content\") pod \"certified-operators-8d627\" (UID: \"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b\") " pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.870955 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-catalog-content\") pod \"certified-operators-8d627\" (UID: \"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b\") " pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.871057 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.871509 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fsvt\" (UniqueName: \"kubernetes.io/projected/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-kube-api-access-5fsvt\") pod \"certified-operators-8d627\" (UID: \"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b\") " pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.871540 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-utilities\") pod \"certified-operators-8d627\" (UID: \"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b\") " pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:24:44 crc kubenswrapper[4764]: E0309 13:24:44.871763 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:45.371718722 +0000 UTC m=+240.621890630 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.872071 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-utilities\") pod \"certified-operators-8d627\" (UID: \"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b\") " pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.912586 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mbm5b"] Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.914247 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.917853 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nrc8s" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.918199 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mbm5b"] Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.923103 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fsvt\" (UniqueName: \"kubernetes.io/projected/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-kube-api-access-5fsvt\") pod \"certified-operators-8d627\" (UID: \"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b\") " pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.948177 4764 ???:1] "http: TLS handshake error from 192.168.126.11:48452: no serving certificate available for the kubelet" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.974084 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.974304 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20acdcb5-ea78-435e-b472-e102d5553c75-catalog-content\") pod \"community-operators-mbm5b\" (UID: \"20acdcb5-ea78-435e-b472-e102d5553c75\") " pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.974363 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4s9m\" (UniqueName: \"kubernetes.io/projected/20acdcb5-ea78-435e-b472-e102d5553c75-kube-api-access-f4s9m\") pod \"community-operators-mbm5b\" (UID: 
\"20acdcb5-ea78-435e-b472-e102d5553c75\") " pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.974387 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20acdcb5-ea78-435e-b472-e102d5553c75-utilities\") pod \"community-operators-mbm5b\" (UID: \"20acdcb5-ea78-435e-b472-e102d5553c75\") " pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:24:44 crc kubenswrapper[4764]: E0309 13:24:44.974522 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:45.474507035 +0000 UTC m=+240.724678943 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.047033 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.052336 4764 ???:1] "http: TLS handshake error from 192.168.126.11:48460: no serving certificate available for the kubelet" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.080617 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.080692 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20acdcb5-ea78-435e-b472-e102d5553c75-catalog-content\") pod \"community-operators-mbm5b\" (UID: \"20acdcb5-ea78-435e-b472-e102d5553c75\") " pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.080724 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4s9m\" (UniqueName: \"kubernetes.io/projected/20acdcb5-ea78-435e-b472-e102d5553c75-kube-api-access-f4s9m\") pod \"community-operators-mbm5b\" (UID: \"20acdcb5-ea78-435e-b472-e102d5553c75\") " pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.080741 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20acdcb5-ea78-435e-b472-e102d5553c75-utilities\") pod \"community-operators-mbm5b\" (UID: \"20acdcb5-ea78-435e-b472-e102d5553c75\") " pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.081537 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20acdcb5-ea78-435e-b472-e102d5553c75-utilities\") pod \"community-operators-mbm5b\" (UID: \"20acdcb5-ea78-435e-b472-e102d5553c75\") " pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.081784 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20acdcb5-ea78-435e-b472-e102d5553c75-catalog-content\") pod \"community-operators-mbm5b\" (UID: \"20acdcb5-ea78-435e-b472-e102d5553c75\") " pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:24:45 crc kubenswrapper[4764]: E0309 13:24:45.082012 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:45.581999126 +0000 UTC m=+240.832171034 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.121787 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jn8f5"] Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.122919 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jn8f5" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.127271 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jn8f5"] Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.150006 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4s9m\" (UniqueName: \"kubernetes.io/projected/20acdcb5-ea78-435e-b472-e102d5553c75-kube-api-access-f4s9m\") pod \"community-operators-mbm5b\" (UID: \"20acdcb5-ea78-435e-b472-e102d5553c75\") " pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.161637 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.182032 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:45 crc kubenswrapper[4764]: E0309 13:24:45.182228 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:45.68220007 +0000 UTC m=+240.932371988 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.182653 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.182704 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a76121be-d090-4f2a-9e57-1a160a4bb4f2-utilities\") pod \"certified-operators-jn8f5\" (UID: \"a76121be-d090-4f2a-9e57-1a160a4bb4f2\") " pod="openshift-marketplace/certified-operators-jn8f5" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.182759 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a76121be-d090-4f2a-9e57-1a160a4bb4f2-catalog-content\") pod \"certified-operators-jn8f5\" (UID: \"a76121be-d090-4f2a-9e57-1a160a4bb4f2\") " pod="openshift-marketplace/certified-operators-jn8f5" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.182777 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc5m2\" (UniqueName: 
\"kubernetes.io/projected/a76121be-d090-4f2a-9e57-1a160a4bb4f2-kube-api-access-dc5m2\") pod \"certified-operators-jn8f5\" (UID: \"a76121be-d090-4f2a-9e57-1a160a4bb4f2\") " pod="openshift-marketplace/certified-operators-jn8f5" Mar 09 13:24:45 crc kubenswrapper[4764]: E0309 13:24:45.183050 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:45.683038072 +0000 UTC m=+240.933209980 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.203613 4764 ???:1] "http: TLS handshake error from 192.168.126.11:48476: no serving certificate available for the kubelet" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.213823 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5d896c677f-8g9dz"] Mar 09 13:24:45 crc kubenswrapper[4764]: E0309 13:24:45.214023 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b625331d-48ab-4d48-86fd-fe73466305ff" containerName="controller-manager" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.214033 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b625331d-48ab-4d48-86fd-fe73466305ff" containerName="controller-manager" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.214117 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b625331d-48ab-4d48-86fd-fe73466305ff" 
containerName="controller-manager" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.214456 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.241505 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d896c677f-8g9dz"] Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.273161 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.284044 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-client-ca\") pod \"b625331d-48ab-4d48-86fd-fe73466305ff\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.284129 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-config\") pod \"b625331d-48ab-4d48-86fd-fe73466305ff\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.284148 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b625331d-48ab-4d48-86fd-fe73466305ff-serving-cert\") pod \"b625331d-48ab-4d48-86fd-fe73466305ff\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.284181 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-proxy-ca-bundles\") pod \"b625331d-48ab-4d48-86fd-fe73466305ff\" (UID: 
\"b625331d-48ab-4d48-86fd-fe73466305ff\") " Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.284212 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p79zf\" (UniqueName: \"kubernetes.io/projected/b625331d-48ab-4d48-86fd-fe73466305ff-kube-api-access-p79zf\") pod \"b625331d-48ab-4d48-86fd-fe73466305ff\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.284351 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.284503 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp2v7\" (UniqueName: \"kubernetes.io/projected/ef84d4f2-b722-415f-bc23-d472e00474b4-kube-api-access-jp2v7\") pod \"controller-manager-5d896c677f-8g9dz\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.284539 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a76121be-d090-4f2a-9e57-1a160a4bb4f2-catalog-content\") pod \"certified-operators-jn8f5\" (UID: \"a76121be-d090-4f2a-9e57-1a160a4bb4f2\") " pod="openshift-marketplace/certified-operators-jn8f5" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.284560 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc5m2\" (UniqueName: \"kubernetes.io/projected/a76121be-d090-4f2a-9e57-1a160a4bb4f2-kube-api-access-dc5m2\") pod \"certified-operators-jn8f5\" (UID: 
\"a76121be-d090-4f2a-9e57-1a160a4bb4f2\") " pod="openshift-marketplace/certified-operators-jn8f5" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.284585 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-config\") pod \"controller-manager-5d896c677f-8g9dz\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.284616 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-proxy-ca-bundles\") pod \"controller-manager-5d896c677f-8g9dz\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.284682 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-client-ca\") pod \"controller-manager-5d896c677f-8g9dz\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.284701 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a76121be-d090-4f2a-9e57-1a160a4bb4f2-utilities\") pod \"certified-operators-jn8f5\" (UID: \"a76121be-d090-4f2a-9e57-1a160a4bb4f2\") " pod="openshift-marketplace/certified-operators-jn8f5" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.284719 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ef84d4f2-b722-415f-bc23-d472e00474b4-serving-cert\") pod \"controller-manager-5d896c677f-8g9dz\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.285471 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-client-ca" (OuterVolumeSpecName: "client-ca") pod "b625331d-48ab-4d48-86fd-fe73466305ff" (UID: "b625331d-48ab-4d48-86fd-fe73466305ff"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.286031 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a76121be-d090-4f2a-9e57-1a160a4bb4f2-catalog-content\") pod \"certified-operators-jn8f5\" (UID: \"a76121be-d090-4f2a-9e57-1a160a4bb4f2\") " pod="openshift-marketplace/certified-operators-jn8f5" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.290954 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b625331d-48ab-4d48-86fd-fe73466305ff" (UID: "b625331d-48ab-4d48-86fd-fe73466305ff"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:24:45 crc kubenswrapper[4764]: E0309 13:24:45.291071 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:45.791047216 +0000 UTC m=+241.041219744 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.291166 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a76121be-d090-4f2a-9e57-1a160a4bb4f2-utilities\") pod \"certified-operators-jn8f5\" (UID: \"a76121be-d090-4f2a-9e57-1a160a4bb4f2\") " pod="openshift-marketplace/certified-operators-jn8f5" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.306486 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b625331d-48ab-4d48-86fd-fe73466305ff-kube-api-access-p79zf" (OuterVolumeSpecName: "kube-api-access-p79zf") pod "b625331d-48ab-4d48-86fd-fe73466305ff" (UID: "b625331d-48ab-4d48-86fd-fe73466305ff"). InnerVolumeSpecName "kube-api-access-p79zf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.310963 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-config" (OuterVolumeSpecName: "config") pod "b625331d-48ab-4d48-86fd-fe73466305ff" (UID: "b625331d-48ab-4d48-86fd-fe73466305ff"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.312239 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b625331d-48ab-4d48-86fd-fe73466305ff-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b625331d-48ab-4d48-86fd-fe73466305ff" (UID: "b625331d-48ab-4d48-86fd-fe73466305ff"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.327529 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc5m2\" (UniqueName: \"kubernetes.io/projected/a76121be-d090-4f2a-9e57-1a160a4bb4f2-kube-api-access-dc5m2\") pod \"certified-operators-jn8f5\" (UID: \"a76121be-d090-4f2a-9e57-1a160a4bb4f2\") " pod="openshift-marketplace/certified-operators-jn8f5" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.387376 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-config\") pod \"controller-manager-5d896c677f-8g9dz\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.387425 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-proxy-ca-bundles\") pod \"controller-manager-5d896c677f-8g9dz\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.387502 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.387538 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-client-ca\") pod \"controller-manager-5d896c677f-8g9dz\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.387566 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef84d4f2-b722-415f-bc23-d472e00474b4-serving-cert\") pod \"controller-manager-5d896c677f-8g9dz\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.387618 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp2v7\" (UniqueName: \"kubernetes.io/projected/ef84d4f2-b722-415f-bc23-d472e00474b4-kube-api-access-jp2v7\") pod \"controller-manager-5d896c677f-8g9dz\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.387638 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" Mar 09 13:24:45 crc kubenswrapper[4764]: E0309 13:24:45.387948 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-09 13:24:45.887932501 +0000 UTC m=+241.138104409 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.389819 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.389830 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b625331d-48ab-4d48-86fd-fe73466305ff-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.389839 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.389847 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.389856 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p79zf\" (UniqueName: \"kubernetes.io/projected/b625331d-48ab-4d48-86fd-fe73466305ff-kube-api-access-p79zf\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.388953 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-client-ca\") pod \"controller-manager-5d896c677f-8g9dz\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.392045 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef84d4f2-b722-415f-bc23-d472e00474b4-serving-cert\") pod \"controller-manager-5d896c677f-8g9dz\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.395232 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-proxy-ca-bundles\") pod \"controller-manager-5d896c677f-8g9dz\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.398041 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-config\") pod \"controller-manager-5d896c677f-8g9dz\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.420785 4764 generic.go:334] "Generic (PLEG): container finished" podID="ad49b4d8-1218-4a34-8455-831d0f563cbf" containerID="87b7e8df716d6db2234cb0d9f4b5435733eebf3453983d174cfd8ed4a517ff0c" exitCode=0 Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.421014 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" 
event={"ID":"ad49b4d8-1218-4a34-8455-831d0f563cbf","Type":"ContainerDied","Data":"87b7e8df716d6db2234cb0d9f4b5435733eebf3453983d174cfd8ed4a517ff0c"} Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.421273 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" event={"ID":"ad49b4d8-1218-4a34-8455-831d0f563cbf","Type":"ContainerDied","Data":"3ebce9784954b6d704d045dc1cbbc83b452607e4967b3e3637a00f82a3e69729"} Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.421299 4764 scope.go:117] "RemoveContainer" containerID="87b7e8df716d6db2234cb0d9f4b5435733eebf3453983d174cfd8ed4a517ff0c" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.426409 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp2v7\" (UniqueName: \"kubernetes.io/projected/ef84d4f2-b722-415f-bc23-d472e00474b4-kube-api-access-jp2v7\") pod \"controller-manager-5d896c677f-8g9dz\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.481489 4764 generic.go:334] "Generic (PLEG): container finished" podID="b625331d-48ab-4d48-86fd-fe73466305ff" containerID="878ba1fbacc5d195f2c3f5d1f7ed102a9da213e083eb5f15b669f308393abf9d" exitCode=0 Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.481696 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.482294 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" event={"ID":"b625331d-48ab-4d48-86fd-fe73466305ff","Type":"ContainerDied","Data":"878ba1fbacc5d195f2c3f5d1f7ed102a9da213e083eb5f15b669f308393abf9d"} Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.482341 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" event={"ID":"b625331d-48ab-4d48-86fd-fe73466305ff","Type":"ContainerDied","Data":"22fd4fd8035fc57032bbb13916cd7b7736a6fafcf098f02294f9432a8954ab17"} Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.490295 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6" event={"ID":"36228b47-79c7-484d-9753-6f36806aa344","Type":"ContainerStarted","Data":"692e2aeac6cc8771f4971ec37256a0c666f20ffafaa1026487679c570aa94d67"} Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.491588 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48tpb\" (UniqueName: \"kubernetes.io/projected/ad49b4d8-1218-4a34-8455-831d0f563cbf-kube-api-access-48tpb\") pod \"ad49b4d8-1218-4a34-8455-831d0f563cbf\" (UID: \"ad49b4d8-1218-4a34-8455-831d0f563cbf\") " Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.491678 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad49b4d8-1218-4a34-8455-831d0f563cbf-config\") pod \"ad49b4d8-1218-4a34-8455-831d0f563cbf\" (UID: \"ad49b4d8-1218-4a34-8455-831d0f563cbf\") " Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.491860 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.491916 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ad49b4d8-1218-4a34-8455-831d0f563cbf-client-ca\") pod \"ad49b4d8-1218-4a34-8455-831d0f563cbf\" (UID: \"ad49b4d8-1218-4a34-8455-831d0f563cbf\") " Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.491978 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad49b4d8-1218-4a34-8455-831d0f563cbf-serving-cert\") pod \"ad49b4d8-1218-4a34-8455-831d0f563cbf\" (UID: \"ad49b4d8-1218-4a34-8455-831d0f563cbf\") " Mar 09 13:24:45 crc kubenswrapper[4764]: E0309 13:24:45.492718 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:45.992691208 +0000 UTC m=+241.242863116 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.493580 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad49b4d8-1218-4a34-8455-831d0f563cbf-config" (OuterVolumeSpecName: "config") pod "ad49b4d8-1218-4a34-8455-831d0f563cbf" (UID: "ad49b4d8-1218-4a34-8455-831d0f563cbf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.494222 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad49b4d8-1218-4a34-8455-831d0f563cbf-client-ca" (OuterVolumeSpecName: "client-ca") pod "ad49b4d8-1218-4a34-8455-831d0f563cbf" (UID: "ad49b4d8-1218-4a34-8455-831d0f563cbf"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.494406 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad49b4d8-1218-4a34-8455-831d0f563cbf-kube-api-access-48tpb" (OuterVolumeSpecName: "kube-api-access-48tpb") pod "ad49b4d8-1218-4a34-8455-831d0f563cbf" (UID: "ad49b4d8-1218-4a34-8455-831d0f563cbf"). InnerVolumeSpecName "kube-api-access-48tpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.495945 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jn8f5" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.502616 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad49b4d8-1218-4a34-8455-831d0f563cbf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ad49b4d8-1218-4a34-8455-831d0f563cbf" (UID: "ad49b4d8-1218-4a34-8455-831d0f563cbf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.523381 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b6nrf" event={"ID":"2c795327-bff9-453d-b1b2-d48a2ef2b48f","Type":"ContainerStarted","Data":"046d9711018c15dc8f5ae3ada6a1affa5fbeb6686e7e7b8ab7b0558e72873174"} Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.569847 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tgqwl"] Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.582637 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b6nrf" podStartSLOduration=174.582619335 podStartE2EDuration="2m54.582619335s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:45.570399757 +0000 UTC m=+240.820571665" watchObservedRunningTime="2026-03-09 13:24:45.582619335 +0000 UTC m=+240.832791243" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.594720 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: 
\"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.594819 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad49b4d8-1218-4a34-8455-831d0f563cbf-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.594842 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48tpb\" (UniqueName: \"kubernetes.io/projected/ad49b4d8-1218-4a34-8455-831d0f563cbf-kube-api-access-48tpb\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.594858 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad49b4d8-1218-4a34-8455-831d0f563cbf-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.594869 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ad49b4d8-1218-4a34-8455-831d0f563cbf-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.600624 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tgqwl"] Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.604474 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" Mar 09 13:24:45 crc kubenswrapper[4764]: E0309 13:24:45.616160 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:46.116121806 +0000 UTC m=+241.366293714 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.642506 4764 scope.go:117] "RemoveContainer" containerID="87b7e8df716d6db2234cb0d9f4b5435733eebf3453983d174cfd8ed4a517ff0c" Mar 09 13:24:45 crc kubenswrapper[4764]: E0309 13:24:45.660508 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87b7e8df716d6db2234cb0d9f4b5435733eebf3453983d174cfd8ed4a517ff0c\": container with ID starting with 87b7e8df716d6db2234cb0d9f4b5435733eebf3453983d174cfd8ed4a517ff0c not found: ID does not exist" containerID="87b7e8df716d6db2234cb0d9f4b5435733eebf3453983d174cfd8ed4a517ff0c" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.660562 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87b7e8df716d6db2234cb0d9f4b5435733eebf3453983d174cfd8ed4a517ff0c"} err="failed to get container status \"87b7e8df716d6db2234cb0d9f4b5435733eebf3453983d174cfd8ed4a517ff0c\": rpc error: code = NotFound desc = could not find container \"87b7e8df716d6db2234cb0d9f4b5435733eebf3453983d174cfd8ed4a517ff0c\": container with ID starting with 87b7e8df716d6db2234cb0d9f4b5435733eebf3453983d174cfd8ed4a517ff0c not found: ID does not exist" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.660588 4764 scope.go:117] "RemoveContainer" containerID="878ba1fbacc5d195f2c3f5d1f7ed102a9da213e083eb5f15b669f308393abf9d" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.697238 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:45 crc kubenswrapper[4764]: E0309 13:24:45.698193 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:46.198176042 +0000 UTC m=+241.448347940 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.732053 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b625331d-48ab-4d48-86fd-fe73466305ff" path="/var/lib/kubelet/pods/b625331d-48ab-4d48-86fd-fe73466305ff/volumes" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.732624 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" event={"ID":"e56f29e6-292c-48d2-a5d7-531cfa6689f5","Type":"ContainerStarted","Data":"cb937b57e16e88d19c62c3e17a738b76cd16d7fabca3520c97db154048373031"} Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.732716 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hf98d" 
event={"ID":"957cfae4-9b1a-4e7f-b526-60bc1ee1a1b4","Type":"ContainerStarted","Data":"9736db70b45f7e6f3f750ae091237076377a90d5b47d639816161eca9a1ab904"} Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.732737 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nrc8s"] Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.775121 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc" event={"ID":"39249ace-8681-491c-a547-869e72d297a7","Type":"ContainerStarted","Data":"36c78f51d09476173476e5ef9244865c7c61164271085db7b7de7ac5ba60c53f"} Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.795169 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:45 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 09 13:24:45 crc kubenswrapper[4764]: [+]process-running ok Mar 09 13:24:45 crc kubenswrapper[4764]: healthz check failed Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.795230 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.798894 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:45 crc kubenswrapper[4764]: E0309 
13:24:45.799346 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:46.299331461 +0000 UTC m=+241.549503369 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.828112 4764 scope.go:117] "RemoveContainer" containerID="878ba1fbacc5d195f2c3f5d1f7ed102a9da213e083eb5f15b669f308393abf9d" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.829748 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6nhpx" event={"ID":"0c464093-f30e-4c79-84cd-98a6d49b813c","Type":"ContainerStarted","Data":"e2bb9836472cc420ee3d690a74378dd966f0cdbcabd65d5906255c9b974e9a59"} Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.829773 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-6nhpx" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.829783 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6nhpx" event={"ID":"0c464093-f30e-4c79-84cd-98a6d49b813c","Type":"ContainerStarted","Data":"c03e462730d7f8fe62be91c1a0cefc3fe604ceae04ca465686d1c317eddc75b9"} Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.831196 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-x927s container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial 
tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.831233 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x927s" podUID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 09 13:24:45 crc kubenswrapper[4764]: E0309 13:24:45.834577 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"878ba1fbacc5d195f2c3f5d1f7ed102a9da213e083eb5f15b669f308393abf9d\": container with ID starting with 878ba1fbacc5d195f2c3f5d1f7ed102a9da213e083eb5f15b669f308393abf9d not found: ID does not exist" containerID="878ba1fbacc5d195f2c3f5d1f7ed102a9da213e083eb5f15b669f308393abf9d" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.834609 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"878ba1fbacc5d195f2c3f5d1f7ed102a9da213e083eb5f15b669f308393abf9d"} err="failed to get container status \"878ba1fbacc5d195f2c3f5d1f7ed102a9da213e083eb5f15b669f308393abf9d\": rpc error: code = NotFound desc = could not find container \"878ba1fbacc5d195f2c3f5d1f7ed102a9da213e083eb5f15b669f308393abf9d\": container with ID starting with 878ba1fbacc5d195f2c3f5d1f7ed102a9da213e083eb5f15b669f308393abf9d not found: ID does not exist" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.834670 4764 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-d4gwh container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.834691 4764 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" podUID="1ccc5b44-95ad-4f4c-8086-c176c41bbd19" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.854484 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.901043 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:45 crc kubenswrapper[4764]: E0309 13:24:45.902090 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:46.402072904 +0000 UTC m=+241.652244812 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.913449 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc" podStartSLOduration=175.913426518 podStartE2EDuration="2m55.913426518s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:45.908435585 +0000 UTC m=+241.158607503" watchObservedRunningTime="2026-03-09 13:24:45.913426518 +0000 UTC m=+241.163598426" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.933111 4764 ???:1] "http: TLS handshake error from 192.168.126.11:48480: no serving certificate available for the kubelet" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.952632 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6nhpx" podStartSLOduration=9.952613262 podStartE2EDuration="9.952613262s" podCreationTimestamp="2026-03-09 13:24:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:45.952380176 +0000 UTC m=+241.202552104" watchObservedRunningTime="2026-03-09 13:24:45.952613262 +0000 UTC m=+241.202785170" Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.002904 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:46 crc kubenswrapper[4764]: E0309 13:24:46.006701 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:46.506681966 +0000 UTC m=+241.756853954 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.107187 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:46 crc kubenswrapper[4764]: E0309 13:24:46.107841 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:46.607824875 +0000 UTC m=+241.857996783 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:46 crc kubenswrapper[4764]: W0309 13:24:46.154107 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88ba6041_7f8f_48f0_840c_8ea2a9bdc87b.slice/crio-afb89024f1733e90994f05a617baca8bd2578c08f57794c20e8e0031fa2f63e4 WatchSource:0}: Error finding container afb89024f1733e90994f05a617baca8bd2578c08f57794c20e8e0031fa2f63e4: Status 404 returned error can't find the container with id afb89024f1733e90994f05a617baca8bd2578c08f57794c20e8e0031fa2f63e4 Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.155756 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8d627"] Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.158074 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-hf98d" podStartSLOduration=175.158061216 podStartE2EDuration="2m55.158061216s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:46.106403857 +0000 UTC m=+241.356575765" watchObservedRunningTime="2026-03-09 13:24:46.158061216 +0000 UTC m=+241.408233124" Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.208897 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:46 crc kubenswrapper[4764]: E0309 13:24:46.209266 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:46.709254943 +0000 UTC m=+241.959426851 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.258722 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.276870 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.311173 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:46 crc kubenswrapper[4764]: E0309 13:24:46.311364 4764 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:46.811333917 +0000 UTC m=+242.061505825 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.311503 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:46 crc kubenswrapper[4764]: E0309 13:24:46.311817 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:46.81180549 +0000 UTC m=+242.061977398 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.412126 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:46 crc kubenswrapper[4764]: E0309 13:24:46.424924 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:46.924900491 +0000 UTC m=+242.175072399 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.529885 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:46 crc kubenswrapper[4764]: E0309 13:24:46.530581 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:47.030566762 +0000 UTC m=+242.280738670 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.535294 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mbm5b"] Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.548234 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jn8f5"] Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.632791 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:46 crc kubenswrapper[4764]: E0309 13:24:46.633247 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:47.133231172 +0000 UTC m=+242.383403080 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.657480 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.734345 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:46 crc kubenswrapper[4764]: E0309 13:24:46.734871 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:47.234856914 +0000 UTC m=+242.485028822 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.776450 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qhs57"] Mar 09 13:24:46 crc kubenswrapper[4764]: E0309 13:24:46.782849 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad49b4d8-1218-4a34-8455-831d0f563cbf" containerName="route-controller-manager" Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.782940 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad49b4d8-1218-4a34-8455-831d0f563cbf" containerName="route-controller-manager" Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.783105 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad49b4d8-1218-4a34-8455-831d0f563cbf" containerName="route-controller-manager" Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.784070 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.786401 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.800857 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:46 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 09 13:24:46 crc kubenswrapper[4764]: [+]process-running ok Mar 09 13:24:46 crc kubenswrapper[4764]: healthz check failed Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.800912 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.820484 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d896c677f-8g9dz"] Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.830778 4764 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-m4bs6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.831114 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" podUID="14b4aa8c-1066-4388-9442-07722e4c76c2" containerName="packageserver" probeResult="failure" output="Get 
\"https://10.217.0.27:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.833562 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qhs57"] Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.845802 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:46 crc kubenswrapper[4764]: E0309 13:24:46.846183 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:47.346169037 +0000 UTC m=+242.596340945 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.947367 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4z7x\" (UniqueName: \"kubernetes.io/projected/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-kube-api-access-j4z7x\") pod \"redhat-marketplace-qhs57\" (UID: \"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df\") " pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.947439 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-catalog-content\") pod \"redhat-marketplace-qhs57\" (UID: \"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df\") " pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.947461 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.947498 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-utilities\") pod \"redhat-marketplace-qhs57\" (UID: \"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df\") " pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:24:46 crc kubenswrapper[4764]: E0309 13:24:46.947834 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:47.44781972 +0000 UTC m=+242.697991628 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.979569 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbm5b" event={"ID":"20acdcb5-ea78-435e-b472-e102d5553c75","Type":"ContainerStarted","Data":"36b8a908fc96eec5fd19468146038ec7f847f96484b3a606a41defe1a23a894e"} Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.985799 4764 generic.go:334] "Generic (PLEG): container finished" podID="be22cbfb-d3e7-43c1-be38-f6fcadeb2c97" containerID="fbe32af2436a64ccf4bd2766ca3822d5d50061459cac93fe26813bb870fd850a" exitCode=0 Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.986935 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrc8s" event={"ID":"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97","Type":"ContainerDied","Data":"fbe32af2436a64ccf4bd2766ca3822d5d50061459cac93fe26813bb870fd850a"} Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.987086 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrc8s" event={"ID":"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97","Type":"ContainerStarted","Data":"32332cee515b03550931490beaabd836e1f122b91e9186c7afe19395bde21caa"} Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.016678 4764 generic.go:334] "Generic (PLEG): container finished" podID="88ba6041-7f8f-48f0-840c-8ea2a9bdc87b" containerID="1ac1423999a9eaea944b592473f076b1d7f0bf8296be44b2cb8bd7cc41e23c1f" exitCode=0 Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.016787 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d627" event={"ID":"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b","Type":"ContainerDied","Data":"1ac1423999a9eaea944b592473f076b1d7f0bf8296be44b2cb8bd7cc41e23c1f"} Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.016820 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d627" event={"ID":"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b","Type":"ContainerStarted","Data":"afb89024f1733e90994f05a617baca8bd2578c08f57794c20e8e0031fa2f63e4"} Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.054730 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn8f5" event={"ID":"a76121be-d090-4f2a-9e57-1a160a4bb4f2","Type":"ContainerStarted","Data":"a07c170a29ea8bcf9be266201f1dd0580d7bdb690c3b989b62809138bb677d6e"} Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.055891 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.055982 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-j4z7x\" (UniqueName: \"kubernetes.io/projected/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-kube-api-access-j4z7x\") pod \"redhat-marketplace-qhs57\" (UID: \"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df\") " pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.056034 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-catalog-content\") pod \"redhat-marketplace-qhs57\" (UID: \"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df\") " pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.056078 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-utilities\") pod \"redhat-marketplace-qhs57\" (UID: \"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df\") " pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.057542 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-catalog-content\") pod \"redhat-marketplace-qhs57\" (UID: \"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df\") " pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.057566 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-utilities\") pod \"redhat-marketplace-qhs57\" (UID: \"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df\") " pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:24:47 crc kubenswrapper[4764]: E0309 13:24:47.057802 4764 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:47.557781857 +0000 UTC m=+242.807953765 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.078979 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.114973 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-g7k9k"] Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.120253 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g7k9k" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.131914 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4z7x\" (UniqueName: \"kubernetes.io/projected/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-kube-api-access-j4z7x\") pod \"redhat-marketplace-qhs57\" (UID: \"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df\") " pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.148172 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g7k9k"] Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.156908 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.164984 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:47 crc kubenswrapper[4764]: E0309 13:24:47.165296 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:47.665281727 +0000 UTC m=+242.915453635 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.209968 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r"] Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.220810 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r"] Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.252939 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.270997 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.271276 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a967c79-e11e-4c58-b42e-652d1406ac88-catalog-content\") pod \"redhat-marketplace-g7k9k\" (UID: \"7a967c79-e11e-4c58-b42e-652d1406ac88\") " pod="openshift-marketplace/redhat-marketplace-g7k9k" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.271407 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a967c79-e11e-4c58-b42e-652d1406ac88-utilities\") pod \"redhat-marketplace-g7k9k\" (UID: \"7a967c79-e11e-4c58-b42e-652d1406ac88\") " pod="openshift-marketplace/redhat-marketplace-g7k9k" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.271582 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx2jl\" (UniqueName: \"kubernetes.io/projected/7a967c79-e11e-4c58-b42e-652d1406ac88-kube-api-access-gx2jl\") pod \"redhat-marketplace-g7k9k\" (UID: \"7a967c79-e11e-4c58-b42e-652d1406ac88\") " pod="openshift-marketplace/redhat-marketplace-g7k9k" Mar 09 13:24:47 crc kubenswrapper[4764]: E0309 13:24:47.272255 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:47.772230612 +0000 UTC m=+243.022402560 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.324309 4764 ???:1] "http: TLS handshake error from 192.168.126.11:48496: no serving certificate available for the kubelet" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.372655 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a967c79-e11e-4c58-b42e-652d1406ac88-utilities\") pod \"redhat-marketplace-g7k9k\" (UID: \"7a967c79-e11e-4c58-b42e-652d1406ac88\") " pod="openshift-marketplace/redhat-marketplace-g7k9k" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.372830 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.372938 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx2jl\" (UniqueName: \"kubernetes.io/projected/7a967c79-e11e-4c58-b42e-652d1406ac88-kube-api-access-gx2jl\") pod \"redhat-marketplace-g7k9k\" (UID: \"7a967c79-e11e-4c58-b42e-652d1406ac88\") " pod="openshift-marketplace/redhat-marketplace-g7k9k" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.373037 4764 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a967c79-e11e-4c58-b42e-652d1406ac88-catalog-content\") pod \"redhat-marketplace-g7k9k\" (UID: \"7a967c79-e11e-4c58-b42e-652d1406ac88\") " pod="openshift-marketplace/redhat-marketplace-g7k9k" Mar 09 13:24:47 crc kubenswrapper[4764]: E0309 13:24:47.373152 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:47.873139086 +0000 UTC m=+243.123310994 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.374337 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a967c79-e11e-4c58-b42e-652d1406ac88-utilities\") pod \"redhat-marketplace-g7k9k\" (UID: \"7a967c79-e11e-4c58-b42e-652d1406ac88\") " pod="openshift-marketplace/redhat-marketplace-g7k9k" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.374372 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a967c79-e11e-4c58-b42e-652d1406ac88-catalog-content\") pod \"redhat-marketplace-g7k9k\" (UID: \"7a967c79-e11e-4c58-b42e-652d1406ac88\") " pod="openshift-marketplace/redhat-marketplace-g7k9k" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.417559 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gx2jl\" (UniqueName: \"kubernetes.io/projected/7a967c79-e11e-4c58-b42e-652d1406ac88-kube-api-access-gx2jl\") pod \"redhat-marketplace-g7k9k\" (UID: \"7a967c79-e11e-4c58-b42e-652d1406ac88\") " pod="openshift-marketplace/redhat-marketplace-g7k9k" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.478956 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:47 crc kubenswrapper[4764]: E0309 13:24:47.479166 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:47.979131965 +0000 UTC m=+243.229303883 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.479266 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:47 crc kubenswrapper[4764]: E0309 13:24:47.479607 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:47.979594698 +0000 UTC m=+243.229766596 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.579901 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad49b4d8-1218-4a34-8455-831d0f563cbf" path="/var/lib/kubelet/pods/ad49b4d8-1218-4a34-8455-831d0f563cbf/volumes" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.581078 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:47 crc kubenswrapper[4764]: E0309 13:24:47.581499 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:48.081477467 +0000 UTC m=+243.331649375 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.581581 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:47 crc kubenswrapper[4764]: E0309 13:24:47.582015 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:48.082006941 +0000 UTC m=+243.332178849 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.602794 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g7k9k" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.683063 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:47 crc kubenswrapper[4764]: E0309 13:24:47.683907 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:48.183841319 +0000 UTC m=+243.434013227 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.757980 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qhs57"] Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.784763 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:47 crc kubenswrapper[4764]: E0309 
13:24:47.785222 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:48.285209095 +0000 UTC m=+243.535381003 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.791399 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:47 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 09 13:24:47 crc kubenswrapper[4764]: [+]process-running ok Mar 09 13:24:47 crc kubenswrapper[4764]: healthz check failed Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.791449 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.831850 4764 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.891649 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:47 crc kubenswrapper[4764]: E0309 13:24:47.892108 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:48.392078108 +0000 UTC m=+243.642250016 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.892550 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:47 crc kubenswrapper[4764]: E0309 13:24:47.893020 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:48.393002133 +0000 UTC m=+243.643174041 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.894098 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tll5t"] Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.895233 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.900145 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.942380 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tll5t"] Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.994154 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:47 crc kubenswrapper[4764]: E0309 13:24:47.994741 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:48.494715168 +0000 UTC m=+243.744887076 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.078388 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9"] Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.079122 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.092185 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.093010 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.093344 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.093732 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.094023 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.095990 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-utilities\") pod \"redhat-operators-tll5t\" (UID: \"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6\") " pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.096026 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-catalog-content\") pod \"redhat-operators-tll5t\" (UID: \"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6\") " pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.096057 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v76nj\" (UniqueName: \"kubernetes.io/projected/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-kube-api-access-v76nj\") pod \"redhat-operators-tll5t\" (UID: \"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6\") " pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.096113 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:48 crc kubenswrapper[4764]: E0309 13:24:48.096483 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:48.596468824 +0000 UTC m=+243.846640732 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.127561 4764 generic.go:334] "Generic (PLEG): container finished" podID="a76121be-d090-4f2a-9e57-1a160a4bb4f2" containerID="851e5f42b5e692a4d7bb4a0fda84945ae1aa93c4dc2838b29856d0c9ed98624f" exitCode=0 Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.128635 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn8f5" event={"ID":"a76121be-d090-4f2a-9e57-1a160a4bb4f2","Type":"ContainerDied","Data":"851e5f42b5e692a4d7bb4a0fda84945ae1aa93c4dc2838b29856d0c9ed98624f"} Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.143770 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.207294 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.207584 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35b2abcd-af84-40fe-8b37-90139612d63e-serving-cert\") pod \"route-controller-manager-866db9688c-qwkl9\" (UID: \"35b2abcd-af84-40fe-8b37-90139612d63e\") " 
pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.207611 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-utilities\") pod \"redhat-operators-tll5t\" (UID: \"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6\") " pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.207632 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/35b2abcd-af84-40fe-8b37-90139612d63e-client-ca\") pod \"route-controller-manager-866db9688c-qwkl9\" (UID: \"35b2abcd-af84-40fe-8b37-90139612d63e\") " pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.207673 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-catalog-content\") pod \"redhat-operators-tll5t\" (UID: \"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6\") " pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.207720 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zpjf\" (UniqueName: \"kubernetes.io/projected/35b2abcd-af84-40fe-8b37-90139612d63e-kube-api-access-9zpjf\") pod \"route-controller-manager-866db9688c-qwkl9\" (UID: \"35b2abcd-af84-40fe-8b37-90139612d63e\") " pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.207748 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v76nj\" (UniqueName: 
\"kubernetes.io/projected/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-kube-api-access-v76nj\") pod \"redhat-operators-tll5t\" (UID: \"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6\") " pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.207769 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35b2abcd-af84-40fe-8b37-90139612d63e-config\") pod \"route-controller-manager-866db9688c-qwkl9\" (UID: \"35b2abcd-af84-40fe-8b37-90139612d63e\") " pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:24:48 crc kubenswrapper[4764]: E0309 13:24:48.207916 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:48.70789895 +0000 UTC m=+243.958070858 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.208270 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-utilities\") pod \"redhat-operators-tll5t\" (UID: \"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6\") " pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.208510 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-catalog-content\") pod \"redhat-operators-tll5t\" (UID: \"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6\") " pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.209543 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9"] Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.254915 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v76nj\" (UniqueName: \"kubernetes.io/projected/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-kube-api-access-v76nj\") pod \"redhat-operators-tll5t\" (UID: \"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6\") " pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.279894 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 
09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.287807 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.309074 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zpjf\" (UniqueName: \"kubernetes.io/projected/35b2abcd-af84-40fe-8b37-90139612d63e-kube-api-access-9zpjf\") pod \"route-controller-manager-866db9688c-qwkl9\" (UID: \"35b2abcd-af84-40fe-8b37-90139612d63e\") " pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.309123 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35b2abcd-af84-40fe-8b37-90139612d63e-config\") pod \"route-controller-manager-866db9688c-qwkl9\" (UID: \"35b2abcd-af84-40fe-8b37-90139612d63e\") " pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.309192 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.309266 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35b2abcd-af84-40fe-8b37-90139612d63e-serving-cert\") pod \"route-controller-manager-866db9688c-qwkl9\" (UID: \"35b2abcd-af84-40fe-8b37-90139612d63e\") " pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:24:48 crc kubenswrapper[4764]: 
I0309 13:24:48.309296 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/35b2abcd-af84-40fe-8b37-90139612d63e-client-ca\") pod \"route-controller-manager-866db9688c-qwkl9\" (UID: \"35b2abcd-af84-40fe-8b37-90139612d63e\") " pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.310886 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/35b2abcd-af84-40fe-8b37-90139612d63e-client-ca\") pod \"route-controller-manager-866db9688c-qwkl9\" (UID: \"35b2abcd-af84-40fe-8b37-90139612d63e\") " pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:24:48 crc kubenswrapper[4764]: E0309 13:24:48.310991 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:48.810961531 +0000 UTC m=+244.061133589 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.317253 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35b2abcd-af84-40fe-8b37-90139612d63e-serving-cert\") pod \"route-controller-manager-866db9688c-qwkl9\" (UID: \"35b2abcd-af84-40fe-8b37-90139612d63e\") " pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.321120 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35b2abcd-af84-40fe-8b37-90139612d63e-config\") pod \"route-controller-manager-866db9688c-qwkl9\" (UID: \"35b2abcd-af84-40fe-8b37-90139612d63e\") " pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.337779 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" event={"ID":"ef84d4f2-b722-415f-bc23-d472e00474b4","Type":"ContainerStarted","Data":"ac78e9706513103621b9ee866b0a26306b3671bd6884454e15e781cf3cf27583"} Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.337820 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" event={"ID":"ef84d4f2-b722-415f-bc23-d472e00474b4","Type":"ContainerStarted","Data":"bd1be1047066fde143bce3e434912476e75a2c14646016c14c7e52ccd0c2869e"} Mar 09 13:24:48 crc 
kubenswrapper[4764]: I0309 13:24:48.337833 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d9z59"] Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.342977 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.343097 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.374740 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d9z59"] Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.380353 4764 generic.go:334] "Generic (PLEG): container finished" podID="20acdcb5-ea78-435e-b472-e102d5553c75" containerID="8e85cc77a9a235fb4dc23dc9451690c29cdcf977870a98d1bc077bac0fa992d1" exitCode=0 Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.380542 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbm5b" event={"ID":"20acdcb5-ea78-435e-b472-e102d5553c75","Type":"ContainerDied","Data":"8e85cc77a9a235fb4dc23dc9451690c29cdcf977870a98d1bc077bac0fa992d1"} Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.405274 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.408839 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zpjf\" (UniqueName: \"kubernetes.io/projected/35b2abcd-af84-40fe-8b37-90139612d63e-kube-api-access-9zpjf\") pod \"route-controller-manager-866db9688c-qwkl9\" (UID: \"35b2abcd-af84-40fe-8b37-90139612d63e\") " pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 
13:24:48.410696 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:48 crc kubenswrapper[4764]: E0309 13:24:48.412834 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:48.912807749 +0000 UTC m=+244.162979657 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.417494 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" event={"ID":"e56f29e6-292c-48d2-a5d7-531cfa6689f5","Type":"ContainerStarted","Data":"8731eb91c9abf063f4e86c70dd77bf09c704c3d37e8f4078169fb9fff1053e33"} Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.417540 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" event={"ID":"e56f29e6-292c-48d2-a5d7-531cfa6689f5","Type":"ContainerStarted","Data":"2f35cd90db2216d308f85082aea56417dc1c891ee59bb6d91d54ae1fc4548ab6"} Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.440890 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-qhs57" event={"ID":"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df","Type":"ContainerStarted","Data":"05c2ccc0c11f9b7d9a5361cced5932ce9d54a4f6f29e12db9d17e75834fe8ead"} Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.440941 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qhs57" event={"ID":"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df","Type":"ContainerStarted","Data":"eabffbe2f3a51c427a01ad46e2c40728c19297f3e8e305f2763268cbfbeb6ba0"} Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.462270 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" podStartSLOduration=5.462255909 podStartE2EDuration="5.462255909s" podCreationTimestamp="2026-03-09 13:24:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:48.461389845 +0000 UTC m=+243.711561763" watchObservedRunningTime="2026-03-09 13:24:48.462255909 +0000 UTC m=+243.712427817" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.465612 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.512919 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-catalog-content\") pod \"redhat-operators-d9z59\" (UID: \"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2\") " pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.513026 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-utilities\") pod \"redhat-operators-d9z59\" (UID: \"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2\") " pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.513077 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.513181 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stkxn\" (UniqueName: \"kubernetes.io/projected/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-kube-api-access-stkxn\") pod \"redhat-operators-d9z59\" (UID: \"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2\") " pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:24:48 crc kubenswrapper[4764]: E0309 13:24:48.515491 4764 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:49.015476219 +0000 UTC m=+244.265648127 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.552794 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.552861 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.552878 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g7k9k"] Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.556151 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.612542 4764 patch_prober.go:28] interesting pod/console-f9d7485db-8g9lj container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.612606 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-8g9lj" podUID="a1b4dc0b-edea-4c0d-8d61-3e3d3133605d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.617182 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.619586 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-catalog-content\") pod \"redhat-operators-d9z59\" (UID: \"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2\") " pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.620316 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-utilities\") pod \"redhat-operators-d9z59\" (UID: \"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2\") " pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.620428 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs\") pod \"network-metrics-daemon-wkwdz\" (UID: \"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\") " pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.620574 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stkxn\" (UniqueName: \"kubernetes.io/projected/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-kube-api-access-stkxn\") pod \"redhat-operators-d9z59\" (UID: \"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2\") " pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:24:48 crc kubenswrapper[4764]: E0309 13:24:48.622815 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:49.122782715 +0000 UTC m=+244.372954623 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.623988 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-catalog-content\") pod \"redhat-operators-d9z59\" (UID: \"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2\") " pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.624443 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-utilities\") pod \"redhat-operators-d9z59\" (UID: \"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2\") " pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.628236 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.647461 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stkxn\" (UniqueName: \"kubernetes.io/projected/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-kube-api-access-stkxn\") pod \"redhat-operators-d9z59\" (UID: \"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2\") " pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.654337 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs\") pod \"network-metrics-daemon-wkwdz\" (UID: \"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\") " pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.683030 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.689465 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.724044 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:48 crc kubenswrapper[4764]: E0309 13:24:48.724471 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:49.224451518 +0000 UTC m=+244.474623426 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.763861 4764 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-09T13:24:47.83187509Z","Handler":null,"Name":""} Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.763998 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.787837 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.793440 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:48 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 09 13:24:48 crc kubenswrapper[4764]: [+]process-running ok Mar 09 13:24:48 crc kubenswrapper[4764]: healthz check failed Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.793501 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:48 crc 
kubenswrapper[4764]: I0309 13:24:48.826228 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:48 crc kubenswrapper[4764]: E0309 13:24:48.826348 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:49.326329798 +0000 UTC m=+244.576501706 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.826678 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:48 crc kubenswrapper[4764]: E0309 13:24:48.827229 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-09 13:24:49.327219952 +0000 UTC m=+244.577391860 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.830523 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.854420 4764 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.854473 4764 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.928960 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.964063 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" 
(UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.030197 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.111118 4764 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.111162 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.218448 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-x927s container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.218901 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x927s" podUID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.219248 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-x927s container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.219265 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-x927s" podUID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.250619 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.446873 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.455340 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.509066 4764 generic.go:334] "Generic (PLEG): container finished" podID="691ffa6f-3ee6-47fa-bcef-9fdd74ac86df" containerID="05c2ccc0c11f9b7d9a5361cced5932ce9d54a4f6f29e12db9d17e75834fe8ead" exitCode=0 Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.509137 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qhs57" event={"ID":"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df","Type":"ContainerDied","Data":"05c2ccc0c11f9b7d9a5361cced5932ce9d54a4f6f29e12db9d17e75834fe8ead"} Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.531064 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" event={"ID":"e56f29e6-292c-48d2-a5d7-531cfa6689f5","Type":"ContainerStarted","Data":"b83d080d807453dffd1053a90dd5bc9e2fdddf2a8f80094f255f56ecd12499fc"} Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.539446 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9"] Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.543541 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tll5t"] Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.548455 4764 generic.go:334] "Generic (PLEG): container finished" podID="7a967c79-e11e-4c58-b42e-652d1406ac88" containerID="a7592b9bea39d66aa3632078215711f53e09d218ed98410b531203d64bd85eca" exitCode=0 Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.548692 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7k9k" event={"ID":"7a967c79-e11e-4c58-b42e-652d1406ac88","Type":"ContainerDied","Data":"a7592b9bea39d66aa3632078215711f53e09d218ed98410b531203d64bd85eca"} Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.548736 
4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7k9k" event={"ID":"7a967c79-e11e-4c58-b42e-652d1406ac88","Type":"ContainerStarted","Data":"2183e838c5144408fdc015b8deb0cb2c5e715404d51e8b64aa5f21859f0ebf3c"} Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.580865 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" podStartSLOduration=13.580823842000001 podStartE2EDuration="13.580823842s" podCreationTimestamp="2026-03-09 13:24:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:49.559041306 +0000 UTC m=+244.809213224" watchObservedRunningTime="2026-03-09 13:24:49.580823842 +0000 UTC m=+244.830995770" Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.617082 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.796660 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:49 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 09 13:24:49 crc kubenswrapper[4764]: [+]process-running ok Mar 09 13:24:49 crc kubenswrapper[4764]: healthz check failed Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.796741 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.855788 4764 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d9z59"] Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.954295 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wkwdz"] Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.971987 4764 ???:1] "http: TLS handshake error from 192.168.126.11:57610: no serving certificate available for the kubelet" Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.126269 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-w49hn"] Mar 09 13:24:50 crc kubenswrapper[4764]: W0309 13:24:50.173115 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3652fe0_4889_432f_af3f_787dd19c60d6.slice/crio-f63eb7186d86d6bf6062656b177ec8adc1e47100bcb490f0e226ebe524f4ea53 WatchSource:0}: Error finding container f63eb7186d86d6bf6062656b177ec8adc1e47100bcb490f0e226ebe524f4ea53: Status 404 returned error can't find the container with id f63eb7186d86d6bf6062656b177ec8adc1e47100bcb490f0e226ebe524f4ea53 Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.603284 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.604892 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.610386 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.610859 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.618776 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.675152 4764 generic.go:334] "Generic (PLEG): container finished" podID="39da5087-79bc-4154-b340-22183d9e4417" containerID="45383c97959a17ffdeaa9f0ab6e8a1110b113c90d84de0fa490663169c04fa26" exitCode=0 Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.675247 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn" event={"ID":"39da5087-79bc-4154-b340-22183d9e4417","Type":"ContainerDied","Data":"45383c97959a17ffdeaa9f0ab6e8a1110b113c90d84de0fa490663169c04fa26"} Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.680432 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" event={"ID":"d3652fe0-4889-432f-af3f-787dd19c60d6","Type":"ContainerStarted","Data":"d4f1dae0a1e050bab0d3f5cbb5a68f2b2bcb5dd4773d77210e4e118d9cdca6b4"} Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.680479 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" event={"ID":"d3652fe0-4889-432f-af3f-787dd19c60d6","Type":"ContainerStarted","Data":"f63eb7186d86d6bf6062656b177ec8adc1e47100bcb490f0e226ebe524f4ea53"} Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.681314 4764 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.683421 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" event={"ID":"6597fc34-10ee-4984-9c69-f4b7c0d46e2a","Type":"ContainerStarted","Data":"97f66587a61051c142b6607c2b2611b4d08ddb23635aab32a8f90392921c094a"} Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.686127 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" event={"ID":"35b2abcd-af84-40fe-8b37-90139612d63e","Type":"ContainerStarted","Data":"3bce33547ff50496fc00d452193e1617ac80b7c148e9afbaf542b8ac329a6ecf"} Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.686170 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" event={"ID":"35b2abcd-af84-40fe-8b37-90139612d63e","Type":"ContainerStarted","Data":"99e3df704f16abbd89cfa68b2dbe7a3e325602c3ff0a20c7684b7b091fc44203"} Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.686442 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.705201 4764 generic.go:334] "Generic (PLEG): container finished" podID="c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2" containerID="d8b26ed27d9d828e2d55afc5c28aac89bdf5c58d97e8a4288af402a3789e0cd2" exitCode=0 Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.705411 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d9z59" event={"ID":"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2","Type":"ContainerDied","Data":"d8b26ed27d9d828e2d55afc5c28aac89bdf5c58d97e8a4288af402a3789e0cd2"} Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 
13:24:50.705469 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d9z59" event={"ID":"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2","Type":"ContainerStarted","Data":"63639f892c3d7cc35dde0976454fc20f0b0dcd4c9977b4b39ee9f80a34190631"} Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.712347 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f756dee9-7011-49b3-8a60-b7e08f01972d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f756dee9-7011-49b3-8a60-b7e08f01972d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.712434 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f756dee9-7011-49b3-8a60-b7e08f01972d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f756dee9-7011-49b3-8a60-b7e08f01972d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.732947 4764 generic.go:334] "Generic (PLEG): container finished" podID="41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6" containerID="ad5ce40518308f615c3da026f6c3fbf2d4af3ef9e2a050d739fa7a496bac2d88" exitCode=0 Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.733047 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tll5t" event={"ID":"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6","Type":"ContainerDied","Data":"ad5ce40518308f615c3da026f6c3fbf2d4af3ef9e2a050d739fa7a496bac2d88"} Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.733112 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tll5t" 
event={"ID":"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6","Type":"ContainerStarted","Data":"adf303b563119dba790a00aa6f3db90393d7970196ac5ffecf8fea14de83b469"} Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.764131 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" podStartSLOduration=180.764103366 podStartE2EDuration="3m0.764103366s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:50.755800823 +0000 UTC m=+246.005972731" watchObservedRunningTime="2026-03-09 13:24:50.764103366 +0000 UTC m=+246.014275274" Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.803698 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:50 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 09 13:24:50 crc kubenswrapper[4764]: [+]process-running ok Mar 09 13:24:50 crc kubenswrapper[4764]: healthz check failed Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.803755 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.813412 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f756dee9-7011-49b3-8a60-b7e08f01972d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f756dee9-7011-49b3-8a60-b7e08f01972d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 13:24:50 crc 
kubenswrapper[4764]: I0309 13:24:50.813643 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f756dee9-7011-49b3-8a60-b7e08f01972d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f756dee9-7011-49b3-8a60-b7e08f01972d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.815267 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f756dee9-7011-49b3-8a60-b7e08f01972d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f756dee9-7011-49b3-8a60-b7e08f01972d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.844002 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.872337 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f756dee9-7011-49b3-8a60-b7e08f01972d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f756dee9-7011-49b3-8a60-b7e08f01972d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.875379 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" podStartSLOduration=7.875366288 podStartE2EDuration="7.875366288s" podCreationTimestamp="2026-03-09 13:24:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:50.872756327 +0000 UTC m=+246.122928235" watchObservedRunningTime="2026-03-09 13:24:50.875366288 +0000 UTC m=+246.125538196" Mar 09 
13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.944338 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.488985 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.492686 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.496700 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.496869 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.497247 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.529386 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/27bcfe8c-a29f-4f0b-9f73-3e075a201db7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"27bcfe8c-a29f-4f0b-9f73-3e075a201db7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.529446 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/27bcfe8c-a29f-4f0b-9f73-3e075a201db7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"27bcfe8c-a29f-4f0b-9f73-3e075a201db7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.615764 4764 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.631153 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/27bcfe8c-a29f-4f0b-9f73-3e075a201db7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"27bcfe8c-a29f-4f0b-9f73-3e075a201db7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.631197 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/27bcfe8c-a29f-4f0b-9f73-3e075a201db7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"27bcfe8c-a29f-4f0b-9f73-3e075a201db7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.631289 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/27bcfe8c-a29f-4f0b-9f73-3e075a201db7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"27bcfe8c-a29f-4f0b-9f73-3e075a201db7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 13:24:51 crc kubenswrapper[4764]: W0309 13:24:51.636460 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf756dee9_7011_49b3_8a60_b7e08f01972d.slice/crio-f47e9b814d8c689b9a403610e6c688065074a6f78e937c6390cbb84def8a7306 WatchSource:0}: Error finding container f47e9b814d8c689b9a403610e6c688065074a6f78e937c6390cbb84def8a7306: Status 404 returned error can't find the container with id f47e9b814d8c689b9a403610e6c688065074a6f78e937c6390cbb84def8a7306 Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.656730 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/27bcfe8c-a29f-4f0b-9f73-3e075a201db7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"27bcfe8c-a29f-4f0b-9f73-3e075a201db7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.792181 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:51 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 09 13:24:51 crc kubenswrapper[4764]: [+]process-running ok Mar 09 13:24:51 crc kubenswrapper[4764]: healthz check failed Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.792253 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.803310 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" event={"ID":"6597fc34-10ee-4984-9c69-f4b7c0d46e2a","Type":"ContainerStarted","Data":"75d6a85048a1e2eb11458ae8be437eeb0a03f7941f5d32a79052f616b848b320"} Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.803352 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" event={"ID":"6597fc34-10ee-4984-9c69-f4b7c0d46e2a","Type":"ContainerStarted","Data":"11436f5431ce19fb1a9c2495b97d4e18b440349aa7e01e3feb8b315186104b47"} Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.820192 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"f756dee9-7011-49b3-8a60-b7e08f01972d","Type":"ContainerStarted","Data":"f47e9b814d8c689b9a403610e6c688065074a6f78e937c6390cbb84def8a7306"} Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.824490 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-wkwdz" podStartSLOduration=181.824478676 podStartE2EDuration="3m1.824478676s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:51.823601143 +0000 UTC m=+247.073773071" watchObservedRunningTime="2026-03-09 13:24:51.824478676 +0000 UTC m=+247.074650584" Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.836561 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.523477 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn" Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.570526 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39da5087-79bc-4154-b340-22183d9e4417-config-volume\") pod \"39da5087-79bc-4154-b340-22183d9e4417\" (UID: \"39da5087-79bc-4154-b340-22183d9e4417\") " Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.570633 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss45h\" (UniqueName: \"kubernetes.io/projected/39da5087-79bc-4154-b340-22183d9e4417-kube-api-access-ss45h\") pod \"39da5087-79bc-4154-b340-22183d9e4417\" (UID: \"39da5087-79bc-4154-b340-22183d9e4417\") " Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.570774 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39da5087-79bc-4154-b340-22183d9e4417-secret-volume\") pod \"39da5087-79bc-4154-b340-22183d9e4417\" (UID: \"39da5087-79bc-4154-b340-22183d9e4417\") " Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.571445 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39da5087-79bc-4154-b340-22183d9e4417-config-volume" (OuterVolumeSpecName: "config-volume") pod "39da5087-79bc-4154-b340-22183d9e4417" (UID: "39da5087-79bc-4154-b340-22183d9e4417"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.588830 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39da5087-79bc-4154-b340-22183d9e4417-kube-api-access-ss45h" (OuterVolumeSpecName: "kube-api-access-ss45h") pod "39da5087-79bc-4154-b340-22183d9e4417" (UID: "39da5087-79bc-4154-b340-22183d9e4417"). 
InnerVolumeSpecName "kube-api-access-ss45h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.593854 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39da5087-79bc-4154-b340-22183d9e4417-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "39da5087-79bc-4154-b340-22183d9e4417" (UID: "39da5087-79bc-4154-b340-22183d9e4417"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.674574 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39da5087-79bc-4154-b340-22183d9e4417-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.674612 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39da5087-79bc-4154-b340-22183d9e4417-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.674624 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss45h\" (UniqueName: \"kubernetes.io/projected/39da5087-79bc-4154-b340-22183d9e4417-kube-api-access-ss45h\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.730087 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.790631 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:52 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 09 13:24:52 crc kubenswrapper[4764]: [+]process-running ok Mar 09 13:24:52 crc 
kubenswrapper[4764]: healthz check failed Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.790790 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.868192 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f756dee9-7011-49b3-8a60-b7e08f01972d","Type":"ContainerStarted","Data":"26fbafd0831fd9da487038da2d1a7850d1c8bfa104a1f78dfc720ff7c56f07b6"} Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.872876 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"27bcfe8c-a29f-4f0b-9f73-3e075a201db7","Type":"ContainerStarted","Data":"11da5fd652d318a343ea97d905dc9b8fda776729b232bdd361ea2f64c5a90799"} Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.887183 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn" event={"ID":"39da5087-79bc-4154-b340-22183d9e4417","Type":"ContainerDied","Data":"c05318f0e67358bb015e1f76742485abff9f53469c740dc411704a1af2febb07"} Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.887232 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c05318f0e67358bb015e1f76742485abff9f53469c740dc411704a1af2febb07" Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.887326 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn" Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.889621 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.8895801629999998 podStartE2EDuration="2.889580163s" podCreationTimestamp="2026-03-09 13:24:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:52.886317966 +0000 UTC m=+248.136489894" watchObservedRunningTime="2026-03-09 13:24:52.889580163 +0000 UTC m=+248.139752081" Mar 09 13:24:53 crc kubenswrapper[4764]: I0309 13:24:53.311114 4764 ???:1] "http: TLS handshake error from 192.168.126.11:57624: no serving certificate available for the kubelet" Mar 09 13:24:53 crc kubenswrapper[4764]: I0309 13:24:53.791077 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:53 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 09 13:24:53 crc kubenswrapper[4764]: [+]process-running ok Mar 09 13:24:53 crc kubenswrapper[4764]: healthz check failed Mar 09 13:24:53 crc kubenswrapper[4764]: I0309 13:24:53.791138 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:53 crc kubenswrapper[4764]: I0309 13:24:53.897125 4764 generic.go:334] "Generic (PLEG): container finished" podID="f756dee9-7011-49b3-8a60-b7e08f01972d" containerID="26fbafd0831fd9da487038da2d1a7850d1c8bfa104a1f78dfc720ff7c56f07b6" exitCode=0 Mar 09 13:24:53 crc kubenswrapper[4764]: 
I0309 13:24:53.897187 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f756dee9-7011-49b3-8a60-b7e08f01972d","Type":"ContainerDied","Data":"26fbafd0831fd9da487038da2d1a7850d1c8bfa104a1f78dfc720ff7c56f07b6"} Mar 09 13:24:54 crc kubenswrapper[4764]: I0309 13:24:54.495475 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6nhpx" Mar 09 13:24:54 crc kubenswrapper[4764]: I0309 13:24:54.790445 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:54 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 09 13:24:54 crc kubenswrapper[4764]: [+]process-running ok Mar 09 13:24:54 crc kubenswrapper[4764]: healthz check failed Mar 09 13:24:54 crc kubenswrapper[4764]: I0309 13:24:54.790502 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:54 crc kubenswrapper[4764]: I0309 13:24:54.930361 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"27bcfe8c-a29f-4f0b-9f73-3e075a201db7","Type":"ContainerStarted","Data":"bd952e333a172908437fe8558e8fefd44c43923bd54e11143d085c1a5dd583ed"} Mar 09 13:24:54 crc kubenswrapper[4764]: I0309 13:24:54.948394 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.9483779759999997 podStartE2EDuration="3.948377976s" podCreationTimestamp="2026-03-09 13:24:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:54.943598808 +0000 UTC m=+250.193770716" watchObservedRunningTime="2026-03-09 13:24:54.948377976 +0000 UTC m=+250.198549884" Mar 09 13:24:55 crc kubenswrapper[4764]: I0309 13:24:55.138691 4764 ???:1] "http: TLS handshake error from 192.168.126.11:57640: no serving certificate available for the kubelet" Mar 09 13:24:55 crc kubenswrapper[4764]: I0309 13:24:55.479562 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 13:24:55 crc kubenswrapper[4764]: I0309 13:24:55.660503 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f756dee9-7011-49b3-8a60-b7e08f01972d-kubelet-dir\") pod \"f756dee9-7011-49b3-8a60-b7e08f01972d\" (UID: \"f756dee9-7011-49b3-8a60-b7e08f01972d\") " Mar 09 13:24:55 crc kubenswrapper[4764]: I0309 13:24:55.660674 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f756dee9-7011-49b3-8a60-b7e08f01972d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f756dee9-7011-49b3-8a60-b7e08f01972d" (UID: "f756dee9-7011-49b3-8a60-b7e08f01972d"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:24:55 crc kubenswrapper[4764]: I0309 13:24:55.660718 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f756dee9-7011-49b3-8a60-b7e08f01972d-kube-api-access\") pod \"f756dee9-7011-49b3-8a60-b7e08f01972d\" (UID: \"f756dee9-7011-49b3-8a60-b7e08f01972d\") " Mar 09 13:24:55 crc kubenswrapper[4764]: I0309 13:24:55.661277 4764 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f756dee9-7011-49b3-8a60-b7e08f01972d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:55 crc kubenswrapper[4764]: I0309 13:24:55.670431 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f756dee9-7011-49b3-8a60-b7e08f01972d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f756dee9-7011-49b3-8a60-b7e08f01972d" (UID: "f756dee9-7011-49b3-8a60-b7e08f01972d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:24:55 crc kubenswrapper[4764]: I0309 13:24:55.762133 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f756dee9-7011-49b3-8a60-b7e08f01972d-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:55 crc kubenswrapper[4764]: I0309 13:24:55.794862 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:55 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 09 13:24:55 crc kubenswrapper[4764]: [+]process-running ok Mar 09 13:24:55 crc kubenswrapper[4764]: healthz check failed Mar 09 13:24:55 crc kubenswrapper[4764]: I0309 13:24:55.794995 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:56 crc kubenswrapper[4764]: I0309 13:24:56.002701 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f756dee9-7011-49b3-8a60-b7e08f01972d","Type":"ContainerDied","Data":"f47e9b814d8c689b9a403610e6c688065074a6f78e937c6390cbb84def8a7306"} Mar 09 13:24:56 crc kubenswrapper[4764]: I0309 13:24:56.004799 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f47e9b814d8c689b9a403610e6c688065074a6f78e937c6390cbb84def8a7306" Mar 09 13:24:56 crc kubenswrapper[4764]: I0309 13:24:56.002719 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 13:24:56 crc kubenswrapper[4764]: I0309 13:24:56.796004 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:56 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 09 13:24:56 crc kubenswrapper[4764]: [+]process-running ok Mar 09 13:24:56 crc kubenswrapper[4764]: healthz check failed Mar 09 13:24:56 crc kubenswrapper[4764]: I0309 13:24:56.796086 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:57 crc kubenswrapper[4764]: I0309 13:24:57.042346 4764 generic.go:334] "Generic (PLEG): container finished" podID="27bcfe8c-a29f-4f0b-9f73-3e075a201db7" containerID="bd952e333a172908437fe8558e8fefd44c43923bd54e11143d085c1a5dd583ed" exitCode=0 Mar 09 13:24:57 crc kubenswrapper[4764]: I0309 13:24:57.042407 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"27bcfe8c-a29f-4f0b-9f73-3e075a201db7","Type":"ContainerDied","Data":"bd952e333a172908437fe8558e8fefd44c43923bd54e11143d085c1a5dd583ed"} Mar 09 13:24:57 crc kubenswrapper[4764]: I0309 13:24:57.793035 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:57 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 09 13:24:57 crc kubenswrapper[4764]: [+]process-running ok Mar 09 13:24:57 crc kubenswrapper[4764]: healthz 
check failed Mar 09 13:24:57 crc kubenswrapper[4764]: I0309 13:24:57.793103 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:58 crc kubenswrapper[4764]: I0309 13:24:58.371073 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:24:58 crc kubenswrapper[4764]: I0309 13:24:58.371350 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:24:58 crc kubenswrapper[4764]: I0309 13:24:58.530688 4764 patch_prober.go:28] interesting pod/console-f9d7485db-8g9lj container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 09 13:24:58 crc kubenswrapper[4764]: I0309 13:24:58.531454 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-8g9lj" podUID="a1b4dc0b-edea-4c0d-8d61-3e3d3133605d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 09 13:24:58 crc kubenswrapper[4764]: I0309 13:24:58.790212 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed 
with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:58 crc kubenswrapper[4764]: [+]has-synced ok Mar 09 13:24:58 crc kubenswrapper[4764]: [+]process-running ok Mar 09 13:24:58 crc kubenswrapper[4764]: healthz check failed Mar 09 13:24:58 crc kubenswrapper[4764]: I0309 13:24:58.790287 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:58 crc kubenswrapper[4764]: I0309 13:24:58.792094 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:24:59 crc kubenswrapper[4764]: I0309 13:24:59.217728 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-x927s container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 09 13:24:59 crc kubenswrapper[4764]: I0309 13:24:59.217780 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-x927s" podUID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 09 13:24:59 crc kubenswrapper[4764]: I0309 13:24:59.217840 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-x927s container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 09 13:24:59 crc kubenswrapper[4764]: I0309 13:24:59.217893 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x927s" 
podUID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 09 13:24:59 crc kubenswrapper[4764]: I0309 13:24:59.792145 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:24:59 crc kubenswrapper[4764]: I0309 13:24:59.796732 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:25:01 crc kubenswrapper[4764]: I0309 13:25:01.824632 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d896c677f-8g9dz"] Mar 09 13:25:01 crc kubenswrapper[4764]: I0309 13:25:01.824918 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" podUID="ef84d4f2-b722-415f-bc23-d472e00474b4" containerName="controller-manager" containerID="cri-o://ac78e9706513103621b9ee866b0a26306b3671bd6884454e15e781cf3cf27583" gracePeriod=30 Mar 09 13:25:01 crc kubenswrapper[4764]: I0309 13:25:01.851704 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9"] Mar 09 13:25:01 crc kubenswrapper[4764]: I0309 13:25:01.851911 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" podUID="35b2abcd-af84-40fe-8b37-90139612d63e" containerName="route-controller-manager" containerID="cri-o://3bce33547ff50496fc00d452193e1617ac80b7c148e9afbaf542b8ac329a6ecf" gracePeriod=30 Mar 09 13:25:03 crc kubenswrapper[4764]: I0309 13:25:03.121640 4764 generic.go:334] "Generic (PLEG): container finished" podID="ef84d4f2-b722-415f-bc23-d472e00474b4" 
containerID="ac78e9706513103621b9ee866b0a26306b3671bd6884454e15e781cf3cf27583" exitCode=0 Mar 09 13:25:03 crc kubenswrapper[4764]: I0309 13:25:03.121689 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" event={"ID":"ef84d4f2-b722-415f-bc23-d472e00474b4","Type":"ContainerDied","Data":"ac78e9706513103621b9ee866b0a26306b3671bd6884454e15e781cf3cf27583"} Mar 09 13:25:03 crc kubenswrapper[4764]: I0309 13:25:03.126125 4764 generic.go:334] "Generic (PLEG): container finished" podID="35b2abcd-af84-40fe-8b37-90139612d63e" containerID="3bce33547ff50496fc00d452193e1617ac80b7c148e9afbaf542b8ac329a6ecf" exitCode=0 Mar 09 13:25:03 crc kubenswrapper[4764]: I0309 13:25:03.126195 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" event={"ID":"35b2abcd-af84-40fe-8b37-90139612d63e","Type":"ContainerDied","Data":"3bce33547ff50496fc00d452193e1617ac80b7c148e9afbaf542b8ac329a6ecf"} Mar 09 13:25:05 crc kubenswrapper[4764]: I0309 13:25:05.606530 4764 patch_prober.go:28] interesting pod/controller-manager-5d896c677f-8g9dz container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: connect: connection refused" start-of-body= Mar 09 13:25:05 crc kubenswrapper[4764]: I0309 13:25:05.606854 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" podUID="ef84d4f2-b722-415f-bc23-d472e00474b4" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: connect: connection refused" Mar 09 13:25:08 crc kubenswrapper[4764]: I0309 13:25:08.467106 4764 patch_prober.go:28] interesting pod/route-controller-manager-866db9688c-qwkl9 container/route-controller-manager 
namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" start-of-body= Mar 09 13:25:08 crc kubenswrapper[4764]: I0309 13:25:08.467154 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" podUID="35b2abcd-af84-40fe-8b37-90139612d63e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" Mar 09 13:25:08 crc kubenswrapper[4764]: I0309 13:25:08.535537 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:25:08 crc kubenswrapper[4764]: I0309 13:25:08.550277 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:25:09 crc kubenswrapper[4764]: I0309 13:25:09.217226 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-x927s container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 09 13:25:09 crc kubenswrapper[4764]: I0309 13:25:09.217519 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x927s" podUID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 09 13:25:09 crc kubenswrapper[4764]: I0309 13:25:09.217262 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-x927s container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" 
start-of-body= Mar 09 13:25:09 crc kubenswrapper[4764]: I0309 13:25:09.217655 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-x927s" podUID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 09 13:25:09 crc kubenswrapper[4764]: I0309 13:25:09.217728 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-x927s" Mar 09 13:25:09 crc kubenswrapper[4764]: I0309 13:25:09.218307 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-x927s container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 09 13:25:09 crc kubenswrapper[4764]: I0309 13:25:09.218340 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x927s" podUID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 09 13:25:09 crc kubenswrapper[4764]: I0309 13:25:09.218931 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"4ed619baf68f65746ed98427321c72cd4adc6131be0c5ab50bca4ff635aa848c"} pod="openshift-console/downloads-7954f5f757-x927s" containerMessage="Container download-server failed liveness probe, will be restarted" Mar 09 13:25:09 crc kubenswrapper[4764]: I0309 13:25:09.218994 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-x927s" podUID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" containerName="download-server" 
containerID="cri-o://4ed619baf68f65746ed98427321c72cd4adc6131be0c5ab50bca4ff635aa848c" gracePeriod=2 Mar 09 13:25:09 crc kubenswrapper[4764]: I0309 13:25:09.463311 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:25:10 crc kubenswrapper[4764]: I0309 13:25:10.178398 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 13:25:10 crc kubenswrapper[4764]: I0309 13:25:10.205206 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"27bcfe8c-a29f-4f0b-9f73-3e075a201db7","Type":"ContainerDied","Data":"11da5fd652d318a343ea97d905dc9b8fda776729b232bdd361ea2f64c5a90799"} Mar 09 13:25:10 crc kubenswrapper[4764]: I0309 13:25:10.205248 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11da5fd652d318a343ea97d905dc9b8fda776729b232bdd361ea2f64c5a90799" Mar 09 13:25:10 crc kubenswrapper[4764]: I0309 13:25:10.205309 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 13:25:10 crc kubenswrapper[4764]: I0309 13:25:10.214246 4764 generic.go:334] "Generic (PLEG): container finished" podID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" containerID="4ed619baf68f65746ed98427321c72cd4adc6131be0c5ab50bca4ff635aa848c" exitCode=0 Mar 09 13:25:10 crc kubenswrapper[4764]: I0309 13:25:10.214278 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-x927s" event={"ID":"d245b116-e47d-4b15-a40d-0a9fa34cf1df","Type":"ContainerDied","Data":"4ed619baf68f65746ed98427321c72cd4adc6131be0c5ab50bca4ff635aa848c"} Mar 09 13:25:10 crc kubenswrapper[4764]: I0309 13:25:10.277801 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/27bcfe8c-a29f-4f0b-9f73-3e075a201db7-kube-api-access\") pod \"27bcfe8c-a29f-4f0b-9f73-3e075a201db7\" (UID: \"27bcfe8c-a29f-4f0b-9f73-3e075a201db7\") " Mar 09 13:25:10 crc kubenswrapper[4764]: I0309 13:25:10.277965 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/27bcfe8c-a29f-4f0b-9f73-3e075a201db7-kubelet-dir\") pod \"27bcfe8c-a29f-4f0b-9f73-3e075a201db7\" (UID: \"27bcfe8c-a29f-4f0b-9f73-3e075a201db7\") " Mar 09 13:25:10 crc kubenswrapper[4764]: I0309 13:25:10.278074 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/27bcfe8c-a29f-4f0b-9f73-3e075a201db7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "27bcfe8c-a29f-4f0b-9f73-3e075a201db7" (UID: "27bcfe8c-a29f-4f0b-9f73-3e075a201db7"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:25:10 crc kubenswrapper[4764]: I0309 13:25:10.278438 4764 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/27bcfe8c-a29f-4f0b-9f73-3e075a201db7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:10 crc kubenswrapper[4764]: I0309 13:25:10.283045 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27bcfe8c-a29f-4f0b-9f73-3e075a201db7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "27bcfe8c-a29f-4f0b-9f73-3e075a201db7" (UID: "27bcfe8c-a29f-4f0b-9f73-3e075a201db7"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:25:10 crc kubenswrapper[4764]: I0309 13:25:10.380933 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/27bcfe8c-a29f-4f0b-9f73-3e075a201db7-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:15 crc kubenswrapper[4764]: I0309 13:25:15.646781 4764 ???:1] "http: TLS handshake error from 192.168.126.11:47958: no serving certificate available for the kubelet" Mar 09 13:25:16 crc kubenswrapper[4764]: I0309 13:25:16.606238 4764 patch_prober.go:28] interesting pod/controller-manager-5d896c677f-8g9dz container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.49:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 13:25:16 crc kubenswrapper[4764]: I0309 13:25:16.606621 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" podUID="ef84d4f2-b722-415f-bc23-d472e00474b4" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.49:8443/healthz\": net/http: request canceled while 
waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.347289 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.353400 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.382965 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35b2abcd-af84-40fe-8b37-90139612d63e-serving-cert\") pod \"35b2abcd-af84-40fe-8b37-90139612d63e\" (UID: \"35b2abcd-af84-40fe-8b37-90139612d63e\") " Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.383048 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/35b2abcd-af84-40fe-8b37-90139612d63e-client-ca\") pod \"35b2abcd-af84-40fe-8b37-90139612d63e\" (UID: \"35b2abcd-af84-40fe-8b37-90139612d63e\") " Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.383089 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35b2abcd-af84-40fe-8b37-90139612d63e-config\") pod \"35b2abcd-af84-40fe-8b37-90139612d63e\" (UID: \"35b2abcd-af84-40fe-8b37-90139612d63e\") " Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.383147 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zpjf\" (UniqueName: \"kubernetes.io/projected/35b2abcd-af84-40fe-8b37-90139612d63e-kube-api-access-9zpjf\") pod \"35b2abcd-af84-40fe-8b37-90139612d63e\" (UID: \"35b2abcd-af84-40fe-8b37-90139612d63e\") " Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.390681 4764 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35b2abcd-af84-40fe-8b37-90139612d63e-client-ca" (OuterVolumeSpecName: "client-ca") pod "35b2abcd-af84-40fe-8b37-90139612d63e" (UID: "35b2abcd-af84-40fe-8b37-90139612d63e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.391074 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35b2abcd-af84-40fe-8b37-90139612d63e-config" (OuterVolumeSpecName: "config") pod "35b2abcd-af84-40fe-8b37-90139612d63e" (UID: "35b2abcd-af84-40fe-8b37-90139612d63e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.391210 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35b2abcd-af84-40fe-8b37-90139612d63e-kube-api-access-9zpjf" (OuterVolumeSpecName: "kube-api-access-9zpjf") pod "35b2abcd-af84-40fe-8b37-90139612d63e" (UID: "35b2abcd-af84-40fe-8b37-90139612d63e"). InnerVolumeSpecName "kube-api-access-9zpjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.394188 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35b2abcd-af84-40fe-8b37-90139612d63e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "35b2abcd-af84-40fe-8b37-90139612d63e" (UID: "35b2abcd-af84-40fe-8b37-90139612d63e"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.396316 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j"] Mar 09 13:25:17 crc kubenswrapper[4764]: E0309 13:25:17.396718 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef84d4f2-b722-415f-bc23-d472e00474b4" containerName="controller-manager" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.396739 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef84d4f2-b722-415f-bc23-d472e00474b4" containerName="controller-manager" Mar 09 13:25:17 crc kubenswrapper[4764]: E0309 13:25:17.396756 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f756dee9-7011-49b3-8a60-b7e08f01972d" containerName="pruner" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.396765 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f756dee9-7011-49b3-8a60-b7e08f01972d" containerName="pruner" Mar 09 13:25:17 crc kubenswrapper[4764]: E0309 13:25:17.396778 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39da5087-79bc-4154-b340-22183d9e4417" containerName="collect-profiles" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.396786 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="39da5087-79bc-4154-b340-22183d9e4417" containerName="collect-profiles" Mar 09 13:25:17 crc kubenswrapper[4764]: E0309 13:25:17.396798 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35b2abcd-af84-40fe-8b37-90139612d63e" containerName="route-controller-manager" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.396806 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="35b2abcd-af84-40fe-8b37-90139612d63e" containerName="route-controller-manager" Mar 09 13:25:17 crc kubenswrapper[4764]: E0309 13:25:17.396814 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="27bcfe8c-a29f-4f0b-9f73-3e075a201db7" containerName="pruner" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.396822 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="27bcfe8c-a29f-4f0b-9f73-3e075a201db7" containerName="pruner" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.396945 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="35b2abcd-af84-40fe-8b37-90139612d63e" containerName="route-controller-manager" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.396964 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="39da5087-79bc-4154-b340-22183d9e4417" containerName="collect-profiles" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.396976 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f756dee9-7011-49b3-8a60-b7e08f01972d" containerName="pruner" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.396988 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef84d4f2-b722-415f-bc23-d472e00474b4" containerName="controller-manager" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.396998 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="27bcfe8c-a29f-4f0b-9f73-3e075a201db7" containerName="pruner" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.397481 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.400218 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j"] Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.484166 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-config\") pod \"ef84d4f2-b722-415f-bc23-d472e00474b4\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.484236 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef84d4f2-b722-415f-bc23-d472e00474b4-serving-cert\") pod \"ef84d4f2-b722-415f-bc23-d472e00474b4\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.484265 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jp2v7\" (UniqueName: \"kubernetes.io/projected/ef84d4f2-b722-415f-bc23-d472e00474b4-kube-api-access-jp2v7\") pod \"ef84d4f2-b722-415f-bc23-d472e00474b4\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.484356 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-client-ca\") pod \"ef84d4f2-b722-415f-bc23-d472e00474b4\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.484417 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-proxy-ca-bundles\") pod 
\"ef84d4f2-b722-415f-bc23-d472e00474b4\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.484609 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6d6p\" (UniqueName: \"kubernetes.io/projected/63fc6d8b-caee-491b-8434-f40958b590d5-kube-api-access-m6d6p\") pod \"route-controller-manager-57b64b694-svn4j\" (UID: \"63fc6d8b-caee-491b-8434-f40958b590d5\") " pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.484655 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63fc6d8b-caee-491b-8434-f40958b590d5-client-ca\") pod \"route-controller-manager-57b64b694-svn4j\" (UID: \"63fc6d8b-caee-491b-8434-f40958b590d5\") " pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.484719 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63fc6d8b-caee-491b-8434-f40958b590d5-config\") pod \"route-controller-manager-57b64b694-svn4j\" (UID: \"63fc6d8b-caee-491b-8434-f40958b590d5\") " pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.485131 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63fc6d8b-caee-491b-8434-f40958b590d5-serving-cert\") pod \"route-controller-manager-57b64b694-svn4j\" (UID: \"63fc6d8b-caee-491b-8434-f40958b590d5\") " pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.485340 4764 reconciler_common.go:293] 
"Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35b2abcd-af84-40fe-8b37-90139612d63e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.485355 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/35b2abcd-af84-40fe-8b37-90139612d63e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.485365 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35b2abcd-af84-40fe-8b37-90139612d63e-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.485376 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zpjf\" (UniqueName: \"kubernetes.io/projected/35b2abcd-af84-40fe-8b37-90139612d63e-kube-api-access-9zpjf\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.485406 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-client-ca" (OuterVolumeSpecName: "client-ca") pod "ef84d4f2-b722-415f-bc23-d472e00474b4" (UID: "ef84d4f2-b722-415f-bc23-d472e00474b4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.485466 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ef84d4f2-b722-415f-bc23-d472e00474b4" (UID: "ef84d4f2-b722-415f-bc23-d472e00474b4"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.485875 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-config" (OuterVolumeSpecName: "config") pod "ef84d4f2-b722-415f-bc23-d472e00474b4" (UID: "ef84d4f2-b722-415f-bc23-d472e00474b4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.487504 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef84d4f2-b722-415f-bc23-d472e00474b4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ef84d4f2-b722-415f-bc23-d472e00474b4" (UID: "ef84d4f2-b722-415f-bc23-d472e00474b4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.487935 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef84d4f2-b722-415f-bc23-d472e00474b4-kube-api-access-jp2v7" (OuterVolumeSpecName: "kube-api-access-jp2v7") pod "ef84d4f2-b722-415f-bc23-d472e00474b4" (UID: "ef84d4f2-b722-415f-bc23-d472e00474b4"). InnerVolumeSpecName "kube-api-access-jp2v7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.586153 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63fc6d8b-caee-491b-8434-f40958b590d5-serving-cert\") pod \"route-controller-manager-57b64b694-svn4j\" (UID: \"63fc6d8b-caee-491b-8434-f40958b590d5\") " pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.586214 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6d6p\" (UniqueName: \"kubernetes.io/projected/63fc6d8b-caee-491b-8434-f40958b590d5-kube-api-access-m6d6p\") pod \"route-controller-manager-57b64b694-svn4j\" (UID: \"63fc6d8b-caee-491b-8434-f40958b590d5\") " pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.586237 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63fc6d8b-caee-491b-8434-f40958b590d5-client-ca\") pod \"route-controller-manager-57b64b694-svn4j\" (UID: \"63fc6d8b-caee-491b-8434-f40958b590d5\") " pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.586284 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63fc6d8b-caee-491b-8434-f40958b590d5-config\") pod \"route-controller-manager-57b64b694-svn4j\" (UID: \"63fc6d8b-caee-491b-8434-f40958b590d5\") " pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.586338 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ef84d4f2-b722-415f-bc23-d472e00474b4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.586349 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jp2v7\" (UniqueName: \"kubernetes.io/projected/ef84d4f2-b722-415f-bc23-d472e00474b4-kube-api-access-jp2v7\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.586362 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.586370 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.586378 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.587614 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63fc6d8b-caee-491b-8434-f40958b590d5-config\") pod \"route-controller-manager-57b64b694-svn4j\" (UID: \"63fc6d8b-caee-491b-8434-f40958b590d5\") " pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.588494 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63fc6d8b-caee-491b-8434-f40958b590d5-client-ca\") pod \"route-controller-manager-57b64b694-svn4j\" (UID: \"63fc6d8b-caee-491b-8434-f40958b590d5\") " 
pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.599833 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63fc6d8b-caee-491b-8434-f40958b590d5-serving-cert\") pod \"route-controller-manager-57b64b694-svn4j\" (UID: \"63fc6d8b-caee-491b-8434-f40958b590d5\") " pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.607409 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6d6p\" (UniqueName: \"kubernetes.io/projected/63fc6d8b-caee-491b-8434-f40958b590d5-kube-api-access-m6d6p\") pod \"route-controller-manager-57b64b694-svn4j\" (UID: \"63fc6d8b-caee-491b-8434-f40958b590d5\") " pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.740199 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" Mar 09 13:25:18 crc kubenswrapper[4764]: I0309 13:25:18.263741 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" event={"ID":"ef84d4f2-b722-415f-bc23-d472e00474b4","Type":"ContainerDied","Data":"bd1be1047066fde143bce3e434912476e75a2c14646016c14c7e52ccd0c2869e"} Mar 09 13:25:18 crc kubenswrapper[4764]: I0309 13:25:18.263762 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" Mar 09 13:25:18 crc kubenswrapper[4764]: I0309 13:25:18.264105 4764 scope.go:117] "RemoveContainer" containerID="ac78e9706513103621b9ee866b0a26306b3671bd6884454e15e781cf3cf27583" Mar 09 13:25:18 crc kubenswrapper[4764]: I0309 13:25:18.268752 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" event={"ID":"35b2abcd-af84-40fe-8b37-90139612d63e","Type":"ContainerDied","Data":"99e3df704f16abbd89cfa68b2dbe7a3e325602c3ff0a20c7684b7b091fc44203"} Mar 09 13:25:18 crc kubenswrapper[4764]: I0309 13:25:18.268884 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:25:18 crc kubenswrapper[4764]: I0309 13:25:18.283903 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d896c677f-8g9dz"] Mar 09 13:25:18 crc kubenswrapper[4764]: I0309 13:25:18.289252 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5d896c677f-8g9dz"] Mar 09 13:25:18 crc kubenswrapper[4764]: I0309 13:25:18.296691 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9"] Mar 09 13:25:18 crc kubenswrapper[4764]: I0309 13:25:18.299436 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9"] Mar 09 13:25:18 crc kubenswrapper[4764]: I0309 13:25:18.696879 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8" Mar 09 13:25:19 crc kubenswrapper[4764]: I0309 13:25:19.217010 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-x927s 
container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 09 13:25:19 crc kubenswrapper[4764]: I0309 13:25:19.217418 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x927s" podUID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 09 13:25:19 crc kubenswrapper[4764]: I0309 13:25:19.572691 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35b2abcd-af84-40fe-8b37-90139612d63e" path="/var/lib/kubelet/pods/35b2abcd-af84-40fe-8b37-90139612d63e/volumes" Mar 09 13:25:19 crc kubenswrapper[4764]: I0309 13:25:19.573414 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef84d4f2-b722-415f-bc23-d472e00474b4" path="/var/lib/kubelet/pods/ef84d4f2-b722-415f-bc23-d472e00474b4/volumes" Mar 09 13:25:19 crc kubenswrapper[4764]: E0309 13:25:19.848112 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 09 13:25:19 crc kubenswrapper[4764]: E0309 13:25:19.848333 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ff292,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-nrc8s_openshift-marketplace(be22cbfb-d3e7-43c1-be38-f6fcadeb2c97): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 13:25:19 crc kubenswrapper[4764]: E0309 13:25:19.849535 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-nrc8s" podUID="be22cbfb-d3e7-43c1-be38-f6fcadeb2c97" Mar 09 13:25:21 crc 
kubenswrapper[4764]: I0309 13:25:21.825922 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7f6c57f99f-67zw4"] Mar 09 13:25:21 crc kubenswrapper[4764]: I0309 13:25:21.826728 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" Mar 09 13:25:21 crc kubenswrapper[4764]: I0309 13:25:21.829782 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 13:25:21 crc kubenswrapper[4764]: I0309 13:25:21.830021 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 13:25:21 crc kubenswrapper[4764]: I0309 13:25:21.830137 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 13:25:21 crc kubenswrapper[4764]: I0309 13:25:21.830248 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 13:25:21 crc kubenswrapper[4764]: I0309 13:25:21.830341 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 13:25:21 crc kubenswrapper[4764]: I0309 13:25:21.830485 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 13:25:21 crc kubenswrapper[4764]: I0309 13:25:21.837507 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 13:25:21 crc kubenswrapper[4764]: I0309 13:25:21.841029 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f6c57f99f-67zw4"] Mar 09 13:25:21 crc kubenswrapper[4764]: I0309 13:25:21.897312 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j"] Mar 09 13:25:21 crc kubenswrapper[4764]: I0309 13:25:21.950667 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-config\") pod \"controller-manager-7f6c57f99f-67zw4\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" Mar 09 13:25:21 crc kubenswrapper[4764]: I0309 13:25:21.950783 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r96c8\" (UniqueName: \"kubernetes.io/projected/e4ef7ddb-4b54-48f7-a879-d07cb3222339-kube-api-access-r96c8\") pod \"controller-manager-7f6c57f99f-67zw4\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" Mar 09 13:25:21 crc kubenswrapper[4764]: I0309 13:25:21.950813 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-client-ca\") pod \"controller-manager-7f6c57f99f-67zw4\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" Mar 09 13:25:21 crc kubenswrapper[4764]: I0309 13:25:21.950872 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4ef7ddb-4b54-48f7-a879-d07cb3222339-serving-cert\") pod \"controller-manager-7f6c57f99f-67zw4\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" Mar 09 13:25:21 crc kubenswrapper[4764]: I0309 13:25:21.950932 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-proxy-ca-bundles\") pod \"controller-manager-7f6c57f99f-67zw4\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" Mar 09 13:25:22 crc kubenswrapper[4764]: I0309 13:25:22.052384 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4ef7ddb-4b54-48f7-a879-d07cb3222339-serving-cert\") pod \"controller-manager-7f6c57f99f-67zw4\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" Mar 09 13:25:22 crc kubenswrapper[4764]: I0309 13:25:22.052497 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-proxy-ca-bundles\") pod \"controller-manager-7f6c57f99f-67zw4\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" Mar 09 13:25:22 crc kubenswrapper[4764]: I0309 13:25:22.052550 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-config\") pod \"controller-manager-7f6c57f99f-67zw4\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" Mar 09 13:25:22 crc kubenswrapper[4764]: I0309 13:25:22.052587 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r96c8\" (UniqueName: \"kubernetes.io/projected/e4ef7ddb-4b54-48f7-a879-d07cb3222339-kube-api-access-r96c8\") pod \"controller-manager-7f6c57f99f-67zw4\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" Mar 09 13:25:22 crc 
kubenswrapper[4764]: I0309 13:25:22.052614 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-client-ca\") pod \"controller-manager-7f6c57f99f-67zw4\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4"
Mar 09 13:25:22 crc kubenswrapper[4764]: I0309 13:25:22.054412 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-config\") pod \"controller-manager-7f6c57f99f-67zw4\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4"
Mar 09 13:25:22 crc kubenswrapper[4764]: I0309 13:25:22.059619 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-client-ca\") pod \"controller-manager-7f6c57f99f-67zw4\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4"
Mar 09 13:25:22 crc kubenswrapper[4764]: I0309 13:25:22.062500 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4ef7ddb-4b54-48f7-a879-d07cb3222339-serving-cert\") pod \"controller-manager-7f6c57f99f-67zw4\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4"
Mar 09 13:25:22 crc kubenswrapper[4764]: I0309 13:25:22.063563 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-proxy-ca-bundles\") pod \"controller-manager-7f6c57f99f-67zw4\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4"
Mar 09 13:25:22 crc kubenswrapper[4764]: I0309 13:25:22.077576 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r96c8\" (UniqueName: \"kubernetes.io/projected/e4ef7ddb-4b54-48f7-a879-d07cb3222339-kube-api-access-r96c8\") pod \"controller-manager-7f6c57f99f-67zw4\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4"
Mar 09 13:25:22 crc kubenswrapper[4764]: I0309 13:25:22.145563 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4"
Mar 09 13:25:24 crc kubenswrapper[4764]: E0309 13:25:24.784361 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-nrc8s" podUID="be22cbfb-d3e7-43c1-be38-f6fcadeb2c97"
Mar 09 13:25:24 crc kubenswrapper[4764]: E0309 13:25:24.877188 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Mar 09 13:25:24 crc kubenswrapper[4764]: E0309 13:25:24.877357 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v76nj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-tll5t_openshift-marketplace(41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 09 13:25:24 crc kubenswrapper[4764]: E0309 13:25:24.879417 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-tll5t" podUID="41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6"
Mar 09 13:25:25 crc
kubenswrapper[4764]: I0309 13:25:25.085997 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 09 13:25:25 crc kubenswrapper[4764]: I0309 13:25:25.087563 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 09 13:25:25 crc kubenswrapper[4764]: I0309 13:25:25.094317 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 09 13:25:25 crc kubenswrapper[4764]: I0309 13:25:25.094597 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Mar 09 13:25:25 crc kubenswrapper[4764]: I0309 13:25:25.097181 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 09 13:25:25 crc kubenswrapper[4764]: I0309 13:25:25.200061 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/048cea49-847b-4232-9ece-3656fccc1909-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"048cea49-847b-4232-9ece-3656fccc1909\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 09 13:25:25 crc kubenswrapper[4764]: I0309 13:25:25.200137 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/048cea49-847b-4232-9ece-3656fccc1909-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"048cea49-847b-4232-9ece-3656fccc1909\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 09 13:25:25 crc kubenswrapper[4764]: I0309 13:25:25.301716 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/048cea49-847b-4232-9ece-3656fccc1909-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"048cea49-847b-4232-9ece-3656fccc1909\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 09 13:25:25 crc kubenswrapper[4764]: I0309 13:25:25.301805 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/048cea49-847b-4232-9ece-3656fccc1909-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"048cea49-847b-4232-9ece-3656fccc1909\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 09 13:25:25 crc kubenswrapper[4764]: I0309 13:25:25.301984 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/048cea49-847b-4232-9ece-3656fccc1909-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"048cea49-847b-4232-9ece-3656fccc1909\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 09 13:25:25 crc kubenswrapper[4764]: I0309 13:25:25.331497 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/048cea49-847b-4232-9ece-3656fccc1909-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"048cea49-847b-4232-9ece-3656fccc1909\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 09 13:25:25 crc kubenswrapper[4764]: I0309 13:25:25.426045 4764 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 09 13:25:27 crc kubenswrapper[4764]: E0309 13:25:27.405913 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-tll5t" podUID="41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6"
Mar 09 13:25:27 crc kubenswrapper[4764]: E0309 13:25:27.491321 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Mar 09 13:25:27 crc kubenswrapper[4764]: E0309 13:25:27.491687 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5fsvt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-8d627_openshift-marketplace(88ba6041-7f8f-48f0-840c-8ea2a9bdc87b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 09 13:25:27 crc kubenswrapper[4764]: E0309 13:25:27.493007 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-8d627" podUID="88ba6041-7f8f-48f0-840c-8ea2a9bdc87b"
Mar 09 13:25:27 crc
kubenswrapper[4764]: E0309 13:25:27.504128 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Mar 09 13:25:27 crc kubenswrapper[4764]: E0309 13:25:27.504299 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-stkxn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-d9z59_openshift-marketplace(c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 09 13:25:27 crc kubenswrapper[4764]: E0309 13:25:27.505839 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-d9z59" podUID="c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2"
Mar 09 13:25:27 crc kubenswrapper[4764]: E0309 13:25:27.506634 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Mar 09 13:25:27 crc kubenswrapper[4764]: E0309 13:25:27.506804 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dc5m2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-jn8f5_openshift-marketplace(a76121be-d090-4f2a-9e57-1a160a4bb4f2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 09 13:25:27 crc kubenswrapper[4764]: E0309 13:25:27.508020 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-jn8f5" podUID="a76121be-d090-4f2a-9e57-1a160a4bb4f2"
Mar 09 13:25:28 crc
kubenswrapper[4764]: I0309 13:25:28.369953 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 13:25:28 crc kubenswrapper[4764]: I0309 13:25:28.370022 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 13:25:28 crc kubenswrapper[4764]: E0309 13:25:28.886050 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-8d627" podUID="88ba6041-7f8f-48f0-840c-8ea2a9bdc87b"
Mar 09 13:25:28 crc kubenswrapper[4764]: E0309 13:25:28.886335 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jn8f5" podUID="a76121be-d090-4f2a-9e57-1a160a4bb4f2"
Mar 09 13:25:28 crc kubenswrapper[4764]: E0309 13:25:28.886454 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-d9z59" podUID="c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2"
Mar 09 13:25:28 crc kubenswrapper[4764]: E0309 13:25:28.956164 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Mar 09 13:25:28 crc kubenswrapper[4764]: E0309 13:25:28.956387 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gx2jl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-g7k9k_openshift-marketplace(7a967c79-e11e-4c58-b42e-652d1406ac88): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 09 13:25:28 crc kubenswrapper[4764]: E0309 13:25:28.957539 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-g7k9k" podUID="7a967c79-e11e-4c58-b42e-652d1406ac88"
Mar 09 13:25:28 crc kubenswrapper[4764]: E0309 13:25:28.973006 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Mar 09 13:25:28 crc kubenswrapper[4764]: E0309 13:25:28.973149 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j4z7x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-qhs57_openshift-marketplace(691ffa6f-3ee6-47fa-bcef-9fdd74ac86df): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 09 13:25:28 crc kubenswrapper[4764]: E0309 13:25:28.974404 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-qhs57" podUID="691ffa6f-3ee6-47fa-bcef-9fdd74ac86df"
Mar 09 13:25:28 crc kubenswrapper[4764]: E0309 13:25:28.985711 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Mar 09 13:25:28 crc kubenswrapper[4764]: E0309 13:25:28.985913 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f4s9m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-mbm5b_openshift-marketplace(20acdcb5-ea78-435e-b472-e102d5553c75): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 09 13:25:28 crc kubenswrapper[4764]: E0309 13:25:28.987074 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-mbm5b" podUID="20acdcb5-ea78-435e-b472-e102d5553c75"
Mar 09 13:25:29 crc kubenswrapper[4764]: I0309 13:25:29.218760 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-x927s container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body=
Mar 09 13:25:29 crc kubenswrapper[4764]: I0309 13:25:29.218834 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x927s" podUID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused"
Mar 09 13:25:29 crc kubenswrapper[4764]: E0309 13:25:29.844085 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-g7k9k" podUID="7a967c79-e11e-4c58-b42e-652d1406ac88"
Mar 09 13:25:29 crc kubenswrapper[4764]: E0309 13:25:29.844140 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image
\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-mbm5b" podUID="20acdcb5-ea78-435e-b472-e102d5553c75"
Mar 09 13:25:29 crc kubenswrapper[4764]: E0309 13:25:29.844225 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-qhs57" podUID="691ffa6f-3ee6-47fa-bcef-9fdd74ac86df"
Mar 09 13:25:29 crc kubenswrapper[4764]: I0309 13:25:29.873000 4764 scope.go:117] "RemoveContainer" containerID="3bce33547ff50496fc00d452193e1617ac80b7c148e9afbaf542b8ac329a6ecf"
Mar 09 13:25:29 crc kubenswrapper[4764]: E0309 13:25:29.876256 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest"
Mar 09 13:25:29 crc kubenswrapper[4764]: E0309 13:25:29.876366 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 09 13:25:29 crc kubenswrapper[4764]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve
Mar 09 13:25:29 crc kubenswrapper[4764]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g47h8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29551044-p748f_openshift-infra(0a005f65-920a-4cdd-b4da-a270953113aa): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled
Mar 09 13:25:29 crc kubenswrapper[4764]: > logger="UnhandledError"
Mar 09 13:25:29 crc kubenswrapper[4764]: E0309 13:25:29.877473 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29551044-p748f" podUID="0a005f65-920a-4cdd-b4da-a270953113aa"
Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.106127 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.113309 4764 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.132930 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.182579 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6079e5ed-2acc-42f5-a62e-ea2a98b18abd\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.182668 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-kube-api-access\") pod \"installer-9-crc\" (UID: \"6079e5ed-2acc-42f5-a62e-ea2a98b18abd\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.182725 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-var-lock\") pod \"installer-9-crc\" (UID: \"6079e5ed-2acc-42f5-a62e-ea2a98b18abd\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.284515 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6079e5ed-2acc-42f5-a62e-ea2a98b18abd\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.284575 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-kube-api-access\") pod \"installer-9-crc\" (UID: \"6079e5ed-2acc-42f5-a62e-ea2a98b18abd\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.284620 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-var-lock\") pod \"installer-9-crc\" (UID: \"6079e5ed-2acc-42f5-a62e-ea2a98b18abd\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.284698 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6079e5ed-2acc-42f5-a62e-ea2a98b18abd\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.284710 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-var-lock\") pod \"installer-9-crc\" (UID: \"6079e5ed-2acc-42f5-a62e-ea2a98b18abd\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.305615 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-kube-api-access\") pod \"installer-9-crc\" (UID: \"6079e5ed-2acc-42f5-a62e-ea2a98b18abd\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.336936 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-x927s" event={"ID":"d245b116-e47d-4b15-a40d-0a9fa34cf1df","Type":"ContainerStarted","Data":"599976ba410eaba88a00e5c0f730b5c1ba416c87b24810e39748b0b6bec77a15"}
Mar 09 13:25:30
crc kubenswrapper[4764]: I0309 13:25:30.337178 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-x927s"
Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.339432 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-x927s container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body=
Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.339486 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x927s" podUID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused"
Mar 09 13:25:30 crc kubenswrapper[4764]: E0309 13:25:30.342780 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29551044-p748f" podUID="0a005f65-920a-4cdd-b4da-a270953113aa"
Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.394127 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f6c57f99f-67zw4"]
Mar 09 13:25:30 crc kubenswrapper[4764]: W0309 13:25:30.408868 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4ef7ddb_4b54_48f7_a879_d07cb3222339.slice/crio-6dfe527522407abed5825087faeed3aa6722bd0b6c51839e5410a49d638c18ec WatchSource:0}: Error finding container 6dfe527522407abed5825087faeed3aa6722bd0b6c51839e5410a49d638c18ec: Status 404 returned error can't find the container with id 6dfe527522407abed5825087faeed3aa6722bd0b6c51839e5410a49d638c18ec
Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.442993 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.456193 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.460394 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j"]
Mar 09 13:25:30 crc kubenswrapper[4764]: W0309 13:25:30.470572 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod048cea49_847b_4232_9ece_3656fccc1909.slice/crio-22a8a78a0997f7beba52c53801155d1c792168145e66312da717d6dc41ad0310 WatchSource:0}: Error finding container 22a8a78a0997f7beba52c53801155d1c792168145e66312da717d6dc41ad0310: Status 404 returned error can't find the container with id 22a8a78a0997f7beba52c53801155d1c792168145e66312da717d6dc41ad0310
Mar 09 13:25:30 crc kubenswrapper[4764]: W0309 13:25:30.475579 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63fc6d8b_caee_491b_8434_f40958b590d5.slice/crio-770029a71edac963fedf423b75b89193eb45638286f1adaa5427198f1f82796f WatchSource:0}: Error finding container 770029a71edac963fedf423b75b89193eb45638286f1adaa5427198f1f82796f: Status 404 returned error can't find the container with id 770029a71edac963fedf423b75b89193eb45638286f1adaa5427198f1f82796f
Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.929558 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 09 13:25:30 crc kubenswrapper[4764]: W0309 13:25:30.947676 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6079e5ed_2acc_42f5_a62e_ea2a98b18abd.slice/crio-2b27f2baa5d5f2b2cb7d6fda09a6905faac786ec6165e8062e44b7064fdcbfbc WatchSource:0}: Error finding container 2b27f2baa5d5f2b2cb7d6fda09a6905faac786ec6165e8062e44b7064fdcbfbc: Status 404 returned error can't find the container with id 2b27f2baa5d5f2b2cb7d6fda09a6905faac786ec6165e8062e44b7064fdcbfbc
Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.348162 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" event={"ID":"e4ef7ddb-4b54-48f7-a879-d07cb3222339","Type":"ContainerStarted","Data":"c13a9bf60f9efa3f676a6a2e63469b91bb787e084e0911f7a2d49d61a829e9cf"}
Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.348700 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" event={"ID":"e4ef7ddb-4b54-48f7-a879-d07cb3222339","Type":"ContainerStarted","Data":"6dfe527522407abed5825087faeed3aa6722bd0b6c51839e5410a49d638c18ec"}
Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.348731 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4"
Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.350173 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"048cea49-847b-4232-9ece-3656fccc1909","Type":"ContainerStarted","Data":"73c91c8f718fb65e017e620c7103bee44d1903f405abd54c0038fd8acbf6dc05"}
Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.350244 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"048cea49-847b-4232-9ece-3656fccc1909","Type":"ContainerStarted","Data":"22a8a78a0997f7beba52c53801155d1c792168145e66312da717d6dc41ad0310"}
Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.352381 4764 kubelet.go:2453]
"SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" event={"ID":"63fc6d8b-caee-491b-8434-f40958b590d5","Type":"ContainerStarted","Data":"0f2b957008f233aacbb248e307f389252c1b53d4330c32f97fcc03549e0a1da4"} Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.352415 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" event={"ID":"63fc6d8b-caee-491b-8434-f40958b590d5","Type":"ContainerStarted","Data":"770029a71edac963fedf423b75b89193eb45638286f1adaa5427198f1f82796f"} Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.352524 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" podUID="63fc6d8b-caee-491b-8434-f40958b590d5" containerName="route-controller-manager" containerID="cri-o://0f2b957008f233aacbb248e307f389252c1b53d4330c32f97fcc03549e0a1da4" gracePeriod=30 Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.352638 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.354031 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6079e5ed-2acc-42f5-a62e-ea2a98b18abd","Type":"ContainerStarted","Data":"2b27f2baa5d5f2b2cb7d6fda09a6905faac786ec6165e8062e44b7064fdcbfbc"} Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.354763 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-x927s container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.354833 4764 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x927s" podUID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.360858 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.362522 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.375034 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" podStartSLOduration=10.375005872 podStartE2EDuration="10.375005872s" podCreationTimestamp="2026-03-09 13:25:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:25:31.370806238 +0000 UTC m=+286.620978156" watchObservedRunningTime="2026-03-09 13:25:31.375005872 +0000 UTC m=+286.625177780" Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.409493 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=6.409465675 podStartE2EDuration="6.409465675s" podCreationTimestamp="2026-03-09 13:25:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:25:31.4089227 +0000 UTC m=+286.659094618" watchObservedRunningTime="2026-03-09 13:25:31.409465675 +0000 UTC m=+286.659637603" Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.457515 4764 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" podStartSLOduration=30.457480167 podStartE2EDuration="30.457480167s" podCreationTimestamp="2026-03-09 13:25:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:25:31.453106118 +0000 UTC m=+286.703278046" watchObservedRunningTime="2026-03-09 13:25:31.457480167 +0000 UTC m=+286.707652085" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.254845 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.288981 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj"] Mar 09 13:25:32 crc kubenswrapper[4764]: E0309 13:25:32.289291 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63fc6d8b-caee-491b-8434-f40958b590d5" containerName="route-controller-manager" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.289307 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="63fc6d8b-caee-491b-8434-f40958b590d5" containerName="route-controller-manager" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.289443 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="63fc6d8b-caee-491b-8434-f40958b590d5" containerName="route-controller-manager" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.289962 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.299765 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj"] Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.318178 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63fc6d8b-caee-491b-8434-f40958b590d5-serving-cert\") pod \"63fc6d8b-caee-491b-8434-f40958b590d5\" (UID: \"63fc6d8b-caee-491b-8434-f40958b590d5\") " Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.318248 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6d6p\" (UniqueName: \"kubernetes.io/projected/63fc6d8b-caee-491b-8434-f40958b590d5-kube-api-access-m6d6p\") pod \"63fc6d8b-caee-491b-8434-f40958b590d5\" (UID: \"63fc6d8b-caee-491b-8434-f40958b590d5\") " Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.318270 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63fc6d8b-caee-491b-8434-f40958b590d5-client-ca\") pod \"63fc6d8b-caee-491b-8434-f40958b590d5\" (UID: \"63fc6d8b-caee-491b-8434-f40958b590d5\") " Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.318314 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63fc6d8b-caee-491b-8434-f40958b590d5-config\") pod \"63fc6d8b-caee-491b-8434-f40958b590d5\" (UID: \"63fc6d8b-caee-491b-8434-f40958b590d5\") " Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.318529 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/556e4318-98c6-4910-830a-2edcafa8c5a3-serving-cert\") pod 
\"route-controller-manager-7745768cd9-mrqxj\" (UID: \"556e4318-98c6-4910-830a-2edcafa8c5a3\") " pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.318591 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/556e4318-98c6-4910-830a-2edcafa8c5a3-client-ca\") pod \"route-controller-manager-7745768cd9-mrqxj\" (UID: \"556e4318-98c6-4910-830a-2edcafa8c5a3\") " pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.318695 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqfnq\" (UniqueName: \"kubernetes.io/projected/556e4318-98c6-4910-830a-2edcafa8c5a3-kube-api-access-rqfnq\") pod \"route-controller-manager-7745768cd9-mrqxj\" (UID: \"556e4318-98c6-4910-830a-2edcafa8c5a3\") " pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.318723 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/556e4318-98c6-4910-830a-2edcafa8c5a3-config\") pod \"route-controller-manager-7745768cd9-mrqxj\" (UID: \"556e4318-98c6-4910-830a-2edcafa8c5a3\") " pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.319505 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63fc6d8b-caee-491b-8434-f40958b590d5-client-ca" (OuterVolumeSpecName: "client-ca") pod "63fc6d8b-caee-491b-8434-f40958b590d5" (UID: "63fc6d8b-caee-491b-8434-f40958b590d5"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.319579 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63fc6d8b-caee-491b-8434-f40958b590d5-config" (OuterVolumeSpecName: "config") pod "63fc6d8b-caee-491b-8434-f40958b590d5" (UID: "63fc6d8b-caee-491b-8434-f40958b590d5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.325373 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63fc6d8b-caee-491b-8434-f40958b590d5-kube-api-access-m6d6p" (OuterVolumeSpecName: "kube-api-access-m6d6p") pod "63fc6d8b-caee-491b-8434-f40958b590d5" (UID: "63fc6d8b-caee-491b-8434-f40958b590d5"). InnerVolumeSpecName "kube-api-access-m6d6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.326384 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63fc6d8b-caee-491b-8434-f40958b590d5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "63fc6d8b-caee-491b-8434-f40958b590d5" (UID: "63fc6d8b-caee-491b-8434-f40958b590d5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.361410 4764 generic.go:334] "Generic (PLEG): container finished" podID="048cea49-847b-4232-9ece-3656fccc1909" containerID="73c91c8f718fb65e017e620c7103bee44d1903f405abd54c0038fd8acbf6dc05" exitCode=0 Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.361489 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"048cea49-847b-4232-9ece-3656fccc1909","Type":"ContainerDied","Data":"73c91c8f718fb65e017e620c7103bee44d1903f405abd54c0038fd8acbf6dc05"} Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.363227 4764 generic.go:334] "Generic (PLEG): container finished" podID="63fc6d8b-caee-491b-8434-f40958b590d5" containerID="0f2b957008f233aacbb248e307f389252c1b53d4330c32f97fcc03549e0a1da4" exitCode=0 Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.363266 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" event={"ID":"63fc6d8b-caee-491b-8434-f40958b590d5","Type":"ContainerDied","Data":"0f2b957008f233aacbb248e307f389252c1b53d4330c32f97fcc03549e0a1da4"} Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.363292 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.363324 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" event={"ID":"63fc6d8b-caee-491b-8434-f40958b590d5","Type":"ContainerDied","Data":"770029a71edac963fedf423b75b89193eb45638286f1adaa5427198f1f82796f"} Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.363341 4764 scope.go:117] "RemoveContainer" containerID="0f2b957008f233aacbb248e307f389252c1b53d4330c32f97fcc03549e0a1da4" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.364869 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6079e5ed-2acc-42f5-a62e-ea2a98b18abd","Type":"ContainerStarted","Data":"a9112ccd1b21485b0fe1f8f0b1cfb2709637cb4e00bc4c414110531d66291dd4"} Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.400146 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.400124528 podStartE2EDuration="2.400124528s" podCreationTimestamp="2026-03-09 13:25:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:25:32.396835028 +0000 UTC m=+287.647006956" watchObservedRunningTime="2026-03-09 13:25:32.400124528 +0000 UTC m=+287.650296446" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.404551 4764 scope.go:117] "RemoveContainer" containerID="0f2b957008f233aacbb248e307f389252c1b53d4330c32f97fcc03549e0a1da4" Mar 09 13:25:32 crc kubenswrapper[4764]: E0309 13:25:32.406235 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f2b957008f233aacbb248e307f389252c1b53d4330c32f97fcc03549e0a1da4\": container with ID starting with 
0f2b957008f233aacbb248e307f389252c1b53d4330c32f97fcc03549e0a1da4 not found: ID does not exist" containerID="0f2b957008f233aacbb248e307f389252c1b53d4330c32f97fcc03549e0a1da4" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.406273 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f2b957008f233aacbb248e307f389252c1b53d4330c32f97fcc03549e0a1da4"} err="failed to get container status \"0f2b957008f233aacbb248e307f389252c1b53d4330c32f97fcc03549e0a1da4\": rpc error: code = NotFound desc = could not find container \"0f2b957008f233aacbb248e307f389252c1b53d4330c32f97fcc03549e0a1da4\": container with ID starting with 0f2b957008f233aacbb248e307f389252c1b53d4330c32f97fcc03549e0a1da4 not found: ID does not exist" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.416116 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j"] Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.420379 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/556e4318-98c6-4910-830a-2edcafa8c5a3-serving-cert\") pod \"route-controller-manager-7745768cd9-mrqxj\" (UID: \"556e4318-98c6-4910-830a-2edcafa8c5a3\") " pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.420489 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/556e4318-98c6-4910-830a-2edcafa8c5a3-client-ca\") pod \"route-controller-manager-7745768cd9-mrqxj\" (UID: \"556e4318-98c6-4910-830a-2edcafa8c5a3\") " pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.420566 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rqfnq\" (UniqueName: \"kubernetes.io/projected/556e4318-98c6-4910-830a-2edcafa8c5a3-kube-api-access-rqfnq\") pod \"route-controller-manager-7745768cd9-mrqxj\" (UID: \"556e4318-98c6-4910-830a-2edcafa8c5a3\") " pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.420594 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/556e4318-98c6-4910-830a-2edcafa8c5a3-config\") pod \"route-controller-manager-7745768cd9-mrqxj\" (UID: \"556e4318-98c6-4910-830a-2edcafa8c5a3\") " pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.420671 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63fc6d8b-caee-491b-8434-f40958b590d5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.420683 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6d6p\" (UniqueName: \"kubernetes.io/projected/63fc6d8b-caee-491b-8434-f40958b590d5-kube-api-access-m6d6p\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.420693 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63fc6d8b-caee-491b-8434-f40958b590d5-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.420702 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63fc6d8b-caee-491b-8434-f40958b590d5-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.421590 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/556e4318-98c6-4910-830a-2edcafa8c5a3-client-ca\") pod \"route-controller-manager-7745768cd9-mrqxj\" (UID: \"556e4318-98c6-4910-830a-2edcafa8c5a3\") " pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.421746 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/556e4318-98c6-4910-830a-2edcafa8c5a3-config\") pod \"route-controller-manager-7745768cd9-mrqxj\" (UID: \"556e4318-98c6-4910-830a-2edcafa8c5a3\") " pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.424059 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/556e4318-98c6-4910-830a-2edcafa8c5a3-serving-cert\") pod \"route-controller-manager-7745768cd9-mrqxj\" (UID: \"556e4318-98c6-4910-830a-2edcafa8c5a3\") " pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.431005 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j"] Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.440853 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqfnq\" (UniqueName: \"kubernetes.io/projected/556e4318-98c6-4910-830a-2edcafa8c5a3-kube-api-access-rqfnq\") pod \"route-controller-manager-7745768cd9-mrqxj\" (UID: \"556e4318-98c6-4910-830a-2edcafa8c5a3\") " pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.621421 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:33 crc kubenswrapper[4764]: I0309 13:25:33.071369 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj"] Mar 09 13:25:33 crc kubenswrapper[4764]: W0309 13:25:33.078143 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod556e4318_98c6_4910_830a_2edcafa8c5a3.slice/crio-335a472e98db78b509578c161edffb85e2cda67cc112f2084014fb62974f8645 WatchSource:0}: Error finding container 335a472e98db78b509578c161edffb85e2cda67cc112f2084014fb62974f8645: Status 404 returned error can't find the container with id 335a472e98db78b509578c161edffb85e2cda67cc112f2084014fb62974f8645 Mar 09 13:25:33 crc kubenswrapper[4764]: I0309 13:25:33.373694 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" event={"ID":"556e4318-98c6-4910-830a-2edcafa8c5a3","Type":"ContainerStarted","Data":"02fabc10f8ca5b240e9914fe91984e0b3ed4f53f2c9e25d78b0f654f639a110c"} Mar 09 13:25:33 crc kubenswrapper[4764]: I0309 13:25:33.373751 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" event={"ID":"556e4318-98c6-4910-830a-2edcafa8c5a3","Type":"ContainerStarted","Data":"335a472e98db78b509578c161edffb85e2cda67cc112f2084014fb62974f8645"} Mar 09 13:25:33 crc kubenswrapper[4764]: I0309 13:25:33.566666 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63fc6d8b-caee-491b-8434-f40958b590d5" path="/var/lib/kubelet/pods/63fc6d8b-caee-491b-8434-f40958b590d5/volumes" Mar 09 13:25:33 crc kubenswrapper[4764]: I0309 13:25:33.654467 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 13:25:33 crc kubenswrapper[4764]: I0309 13:25:33.668670 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/048cea49-847b-4232-9ece-3656fccc1909-kubelet-dir\") pod \"048cea49-847b-4232-9ece-3656fccc1909\" (UID: \"048cea49-847b-4232-9ece-3656fccc1909\") " Mar 09 13:25:33 crc kubenswrapper[4764]: I0309 13:25:33.668721 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/048cea49-847b-4232-9ece-3656fccc1909-kube-api-access\") pod \"048cea49-847b-4232-9ece-3656fccc1909\" (UID: \"048cea49-847b-4232-9ece-3656fccc1909\") " Mar 09 13:25:33 crc kubenswrapper[4764]: I0309 13:25:33.670049 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/048cea49-847b-4232-9ece-3656fccc1909-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "048cea49-847b-4232-9ece-3656fccc1909" (UID: "048cea49-847b-4232-9ece-3656fccc1909"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:25:33 crc kubenswrapper[4764]: I0309 13:25:33.678100 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/048cea49-847b-4232-9ece-3656fccc1909-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "048cea49-847b-4232-9ece-3656fccc1909" (UID: "048cea49-847b-4232-9ece-3656fccc1909"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:25:33 crc kubenswrapper[4764]: I0309 13:25:33.770674 4764 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/048cea49-847b-4232-9ece-3656fccc1909-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:33 crc kubenswrapper[4764]: I0309 13:25:33.770707 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/048cea49-847b-4232-9ece-3656fccc1909-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:34 crc kubenswrapper[4764]: I0309 13:25:34.381956 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"048cea49-847b-4232-9ece-3656fccc1909","Type":"ContainerDied","Data":"22a8a78a0997f7beba52c53801155d1c792168145e66312da717d6dc41ad0310"} Mar 09 13:25:34 crc kubenswrapper[4764]: I0309 13:25:34.381999 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 13:25:34 crc kubenswrapper[4764]: I0309 13:25:34.382001 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22a8a78a0997f7beba52c53801155d1c792168145e66312da717d6dc41ad0310" Mar 09 13:25:34 crc kubenswrapper[4764]: I0309 13:25:34.382739 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:34 crc kubenswrapper[4764]: I0309 13:25:34.388872 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:34 crc kubenswrapper[4764]: I0309 13:25:34.401712 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" podStartSLOduration=13.401694441 podStartE2EDuration="13.401694441s" podCreationTimestamp="2026-03-09 13:25:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:25:34.399162122 +0000 UTC m=+289.649334040" watchObservedRunningTime="2026-03-09 13:25:34.401694441 +0000 UTC m=+289.651866369" Mar 09 13:25:39 crc kubenswrapper[4764]: I0309 13:25:39.216519 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-x927s container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 09 13:25:39 crc kubenswrapper[4764]: I0309 13:25:39.216530 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-x927s container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= 
Mar 09 13:25:39 crc kubenswrapper[4764]: I0309 13:25:39.217079 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-x927s" podUID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 09 13:25:39 crc kubenswrapper[4764]: I0309 13:25:39.217101 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x927s" podUID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 09 13:25:39 crc kubenswrapper[4764]: I0309 13:25:39.427110 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrc8s" event={"ID":"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97","Type":"ContainerStarted","Data":"bf1dacbdf950a01b7bd5a3fe093c4331659602b77ce43e68786707bbc2b89ea8"} Mar 09 13:25:40 crc kubenswrapper[4764]: I0309 13:25:40.435396 4764 generic.go:334] "Generic (PLEG): container finished" podID="be22cbfb-d3e7-43c1-be38-f6fcadeb2c97" containerID="bf1dacbdf950a01b7bd5a3fe093c4331659602b77ce43e68786707bbc2b89ea8" exitCode=0 Mar 09 13:25:40 crc kubenswrapper[4764]: I0309 13:25:40.435437 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrc8s" event={"ID":"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97","Type":"ContainerDied","Data":"bf1dacbdf950a01b7bd5a3fe093c4331659602b77ce43e68786707bbc2b89ea8"} Mar 09 13:25:41 crc kubenswrapper[4764]: I0309 13:25:41.444373 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn8f5" event={"ID":"a76121be-d090-4f2a-9e57-1a160a4bb4f2","Type":"ContainerStarted","Data":"71eda28b996e54f946f3b59d46c82f8b6fc0575bacbdbf501c3dc23dbb439d43"} Mar 09 13:25:41 crc 
kubenswrapper[4764]: I0309 13:25:41.448899 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrc8s" event={"ID":"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97","Type":"ContainerStarted","Data":"0dca8e186618d6e08ce56c027491c8ad89ea2784547d9fbdcf530de2e5a43711"} Mar 09 13:25:41 crc kubenswrapper[4764]: I0309 13:25:41.480273 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nrc8s" podStartSLOduration=3.543738448 podStartE2EDuration="57.480255972s" podCreationTimestamp="2026-03-09 13:24:44 +0000 UTC" firstStartedPulling="2026-03-09 13:24:47.00582228 +0000 UTC m=+242.255994188" lastFinishedPulling="2026-03-09 13:25:40.942339804 +0000 UTC m=+296.192511712" observedRunningTime="2026-03-09 13:25:41.477054834 +0000 UTC m=+296.727226762" watchObservedRunningTime="2026-03-09 13:25:41.480255972 +0000 UTC m=+296.730427880" Mar 09 13:25:41 crc kubenswrapper[4764]: I0309 13:25:41.810599 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7f6c57f99f-67zw4"] Mar 09 13:25:41 crc kubenswrapper[4764]: I0309 13:25:41.810828 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" podUID="e4ef7ddb-4b54-48f7-a879-d07cb3222339" containerName="controller-manager" containerID="cri-o://c13a9bf60f9efa3f676a6a2e63469b91bb787e084e0911f7a2d49d61a829e9cf" gracePeriod=30 Mar 09 13:25:41 crc kubenswrapper[4764]: I0309 13:25:41.824164 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj"] Mar 09 13:25:41 crc kubenswrapper[4764]: I0309 13:25:41.824654 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" podUID="556e4318-98c6-4910-830a-2edcafa8c5a3" 
containerName="route-controller-manager" containerID="cri-o://02fabc10f8ca5b240e9914fe91984e0b3ed4f53f2c9e25d78b0f654f639a110c" gracePeriod=30 Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.476272 4764 generic.go:334] "Generic (PLEG): container finished" podID="556e4318-98c6-4910-830a-2edcafa8c5a3" containerID="02fabc10f8ca5b240e9914fe91984e0b3ed4f53f2c9e25d78b0f654f639a110c" exitCode=0 Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.476375 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" event={"ID":"556e4318-98c6-4910-830a-2edcafa8c5a3","Type":"ContainerDied","Data":"02fabc10f8ca5b240e9914fe91984e0b3ed4f53f2c9e25d78b0f654f639a110c"} Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.480510 4764 generic.go:334] "Generic (PLEG): container finished" podID="e4ef7ddb-4b54-48f7-a879-d07cb3222339" containerID="c13a9bf60f9efa3f676a6a2e63469b91bb787e084e0911f7a2d49d61a829e9cf" exitCode=0 Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.480588 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" event={"ID":"e4ef7ddb-4b54-48f7-a879-d07cb3222339","Type":"ContainerDied","Data":"c13a9bf60f9efa3f676a6a2e63469b91bb787e084e0911f7a2d49d61a829e9cf"} Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.482903 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tll5t" event={"ID":"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6","Type":"ContainerStarted","Data":"0ae328f899ade57662b7a57d61c5864e374b6610f4676d30b76a8c7048f7853c"} Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.486956 4764 generic.go:334] "Generic (PLEG): container finished" podID="a76121be-d090-4f2a-9e57-1a160a4bb4f2" containerID="71eda28b996e54f946f3b59d46c82f8b6fc0575bacbdbf501c3dc23dbb439d43" exitCode=0 Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.487012 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn8f5" event={"ID":"a76121be-d090-4f2a-9e57-1a160a4bb4f2","Type":"ContainerDied","Data":"71eda28b996e54f946f3b59d46c82f8b6fc0575bacbdbf501c3dc23dbb439d43"} Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.492582 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d627" event={"ID":"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b","Type":"ContainerStarted","Data":"8cec696202563cb93ec1099b2921beba11271e2ac2498229b87b108935e01fc2"} Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.757632 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.762626 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.938292 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/556e4318-98c6-4910-830a-2edcafa8c5a3-config\") pod \"556e4318-98c6-4910-830a-2edcafa8c5a3\" (UID: \"556e4318-98c6-4910-830a-2edcafa8c5a3\") " Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.938404 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqfnq\" (UniqueName: \"kubernetes.io/projected/556e4318-98c6-4910-830a-2edcafa8c5a3-kube-api-access-rqfnq\") pod \"556e4318-98c6-4910-830a-2edcafa8c5a3\" (UID: \"556e4318-98c6-4910-830a-2edcafa8c5a3\") " Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.938442 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/556e4318-98c6-4910-830a-2edcafa8c5a3-client-ca\") pod 
\"556e4318-98c6-4910-830a-2edcafa8c5a3\" (UID: \"556e4318-98c6-4910-830a-2edcafa8c5a3\") " Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.938467 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/556e4318-98c6-4910-830a-2edcafa8c5a3-serving-cert\") pod \"556e4318-98c6-4910-830a-2edcafa8c5a3\" (UID: \"556e4318-98c6-4910-830a-2edcafa8c5a3\") " Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.938500 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r96c8\" (UniqueName: \"kubernetes.io/projected/e4ef7ddb-4b54-48f7-a879-d07cb3222339-kube-api-access-r96c8\") pod \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.938552 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-client-ca\") pod \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.938578 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-proxy-ca-bundles\") pod \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.938636 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4ef7ddb-4b54-48f7-a879-d07cb3222339-serving-cert\") pod \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.938695 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-config\") pod \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.939286 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/556e4318-98c6-4910-830a-2edcafa8c5a3-config" (OuterVolumeSpecName: "config") pod "556e4318-98c6-4910-830a-2edcafa8c5a3" (UID: "556e4318-98c6-4910-830a-2edcafa8c5a3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.939539 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e4ef7ddb-4b54-48f7-a879-d07cb3222339" (UID: "e4ef7ddb-4b54-48f7-a879-d07cb3222339"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.939557 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-client-ca" (OuterVolumeSpecName: "client-ca") pod "e4ef7ddb-4b54-48f7-a879-d07cb3222339" (UID: "e4ef7ddb-4b54-48f7-a879-d07cb3222339"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.939716 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-config" (OuterVolumeSpecName: "config") pod "e4ef7ddb-4b54-48f7-a879-d07cb3222339" (UID: "e4ef7ddb-4b54-48f7-a879-d07cb3222339"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.940089 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/556e4318-98c6-4910-830a-2edcafa8c5a3-client-ca" (OuterVolumeSpecName: "client-ca") pod "556e4318-98c6-4910-830a-2edcafa8c5a3" (UID: "556e4318-98c6-4910-830a-2edcafa8c5a3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.950722 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/556e4318-98c6-4910-830a-2edcafa8c5a3-kube-api-access-rqfnq" (OuterVolumeSpecName: "kube-api-access-rqfnq") pod "556e4318-98c6-4910-830a-2edcafa8c5a3" (UID: "556e4318-98c6-4910-830a-2edcafa8c5a3"). InnerVolumeSpecName "kube-api-access-rqfnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.950764 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/556e4318-98c6-4910-830a-2edcafa8c5a3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "556e4318-98c6-4910-830a-2edcafa8c5a3" (UID: "556e4318-98c6-4910-830a-2edcafa8c5a3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.950779 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4ef7ddb-4b54-48f7-a879-d07cb3222339-kube-api-access-r96c8" (OuterVolumeSpecName: "kube-api-access-r96c8") pod "e4ef7ddb-4b54-48f7-a879-d07cb3222339" (UID: "e4ef7ddb-4b54-48f7-a879-d07cb3222339"). InnerVolumeSpecName "kube-api-access-r96c8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.962860 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ef7ddb-4b54-48f7-a879-d07cb3222339-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e4ef7ddb-4b54-48f7-a879-d07cb3222339" (UID: "e4ef7ddb-4b54-48f7-a879-d07cb3222339"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.040465 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/556e4318-98c6-4910-830a-2edcafa8c5a3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.040516 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r96c8\" (UniqueName: \"kubernetes.io/projected/e4ef7ddb-4b54-48f7-a879-d07cb3222339-kube-api-access-r96c8\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.040531 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.040543 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.040556 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4ef7ddb-4b54-48f7-a879-d07cb3222339-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.040567 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.040579 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/556e4318-98c6-4910-830a-2edcafa8c5a3-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.040591 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqfnq\" (UniqueName: \"kubernetes.io/projected/556e4318-98c6-4910-830a-2edcafa8c5a3-kube-api-access-rqfnq\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.040602 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/556e4318-98c6-4910-830a-2edcafa8c5a3-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.110999 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz"] Mar 09 13:25:43 crc kubenswrapper[4764]: E0309 13:25:43.111315 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="048cea49-847b-4232-9ece-3656fccc1909" containerName="pruner" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.111330 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="048cea49-847b-4232-9ece-3656fccc1909" containerName="pruner" Mar 09 13:25:43 crc kubenswrapper[4764]: E0309 13:25:43.111359 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4ef7ddb-4b54-48f7-a879-d07cb3222339" containerName="controller-manager" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.111367 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4ef7ddb-4b54-48f7-a879-d07cb3222339" containerName="controller-manager" Mar 09 13:25:43 crc kubenswrapper[4764]: E0309 13:25:43.111379 4764 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="556e4318-98c6-4910-830a-2edcafa8c5a3" containerName="route-controller-manager" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.111388 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="556e4318-98c6-4910-830a-2edcafa8c5a3" containerName="route-controller-manager" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.111508 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="048cea49-847b-4232-9ece-3656fccc1909" containerName="pruner" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.111525 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4ef7ddb-4b54-48f7-a879-d07cb3222339" containerName="controller-manager" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.111539 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="556e4318-98c6-4910-830a-2edcafa8c5a3" containerName="route-controller-manager" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.112685 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.116099 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7c9d566569-xmnfz"] Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.116842 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.131275 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c9d566569-xmnfz"] Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.135688 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz"] Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.147485 4764 patch_prober.go:28] interesting pod/controller-manager-7f6c57f99f-67zw4 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.147545 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" podUID="e4ef7ddb-4b54-48f7-a879-d07cb3222339" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.242603 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32312778-9c44-4843-a588-5fba60384e05-serving-cert\") pod \"route-controller-manager-fb48676f5-nvfnz\" (UID: \"32312778-9c44-4843-a588-5fba60384e05\") " pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.242691 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8e222771-a709-459f-a36f-e44f4b87983e-serving-cert\") pod \"controller-manager-7c9d566569-xmnfz\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.242736 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l6gj\" (UniqueName: \"kubernetes.io/projected/32312778-9c44-4843-a588-5fba60384e05-kube-api-access-4l6gj\") pod \"route-controller-manager-fb48676f5-nvfnz\" (UID: \"32312778-9c44-4843-a588-5fba60384e05\") " pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.242754 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-client-ca\") pod \"controller-manager-7c9d566569-xmnfz\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.242785 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-proxy-ca-bundles\") pod \"controller-manager-7c9d566569-xmnfz\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.242804 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bf7k\" (UniqueName: \"kubernetes.io/projected/8e222771-a709-459f-a36f-e44f4b87983e-kube-api-access-7bf7k\") pod \"controller-manager-7c9d566569-xmnfz\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " 
pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.242902 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-config\") pod \"controller-manager-7c9d566569-xmnfz\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.242941 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32312778-9c44-4843-a588-5fba60384e05-client-ca\") pod \"route-controller-manager-fb48676f5-nvfnz\" (UID: \"32312778-9c44-4843-a588-5fba60384e05\") " pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.243059 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32312778-9c44-4843-a588-5fba60384e05-config\") pod \"route-controller-manager-fb48676f5-nvfnz\" (UID: \"32312778-9c44-4843-a588-5fba60384e05\") " pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.344181 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l6gj\" (UniqueName: \"kubernetes.io/projected/32312778-9c44-4843-a588-5fba60384e05-kube-api-access-4l6gj\") pod \"route-controller-manager-fb48676f5-nvfnz\" (UID: \"32312778-9c44-4843-a588-5fba60384e05\") " pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.344260 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-client-ca\") pod \"controller-manager-7c9d566569-xmnfz\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.344345 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-proxy-ca-bundles\") pod \"controller-manager-7c9d566569-xmnfz\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.344396 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bf7k\" (UniqueName: \"kubernetes.io/projected/8e222771-a709-459f-a36f-e44f4b87983e-kube-api-access-7bf7k\") pod \"controller-manager-7c9d566569-xmnfz\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.344460 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-config\") pod \"controller-manager-7c9d566569-xmnfz\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.344501 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32312778-9c44-4843-a588-5fba60384e05-client-ca\") pod \"route-controller-manager-fb48676f5-nvfnz\" (UID: \"32312778-9c44-4843-a588-5fba60384e05\") " pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 
13:25:43.344615 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32312778-9c44-4843-a588-5fba60384e05-config\") pod \"route-controller-manager-fb48676f5-nvfnz\" (UID: \"32312778-9c44-4843-a588-5fba60384e05\") " pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.344739 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32312778-9c44-4843-a588-5fba60384e05-serving-cert\") pod \"route-controller-manager-fb48676f5-nvfnz\" (UID: \"32312778-9c44-4843-a588-5fba60384e05\") " pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.344842 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e222771-a709-459f-a36f-e44f4b87983e-serving-cert\") pod \"controller-manager-7c9d566569-xmnfz\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.347238 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-client-ca\") pod \"controller-manager-7c9d566569-xmnfz\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.348231 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-config\") pod \"controller-manager-7c9d566569-xmnfz\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " 
pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.348971 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32312778-9c44-4843-a588-5fba60384e05-client-ca\") pod \"route-controller-manager-fb48676f5-nvfnz\" (UID: \"32312778-9c44-4843-a588-5fba60384e05\") " pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.349461 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32312778-9c44-4843-a588-5fba60384e05-config\") pod \"route-controller-manager-fb48676f5-nvfnz\" (UID: \"32312778-9c44-4843-a588-5fba60384e05\") " pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.350851 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-proxy-ca-bundles\") pod \"controller-manager-7c9d566569-xmnfz\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.350892 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e222771-a709-459f-a36f-e44f4b87983e-serving-cert\") pod \"controller-manager-7c9d566569-xmnfz\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.356812 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32312778-9c44-4843-a588-5fba60384e05-serving-cert\") pod 
\"route-controller-manager-fb48676f5-nvfnz\" (UID: \"32312778-9c44-4843-a588-5fba60384e05\") " pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.368151 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bf7k\" (UniqueName: \"kubernetes.io/projected/8e222771-a709-459f-a36f-e44f4b87983e-kube-api-access-7bf7k\") pod \"controller-manager-7c9d566569-xmnfz\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.397480 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l6gj\" (UniqueName: \"kubernetes.io/projected/32312778-9c44-4843-a588-5fba60384e05-kube-api-access-4l6gj\") pod \"route-controller-manager-fb48676f5-nvfnz\" (UID: \"32312778-9c44-4843-a588-5fba60384e05\") " pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.432191 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.463601 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.507740 4764 generic.go:334] "Generic (PLEG): container finished" podID="20acdcb5-ea78-435e-b472-e102d5553c75" containerID="67daa9bd8128811d30d5710b782098e52f919063831113dfb444335915817615" exitCode=0 Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.507821 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbm5b" event={"ID":"20acdcb5-ea78-435e-b472-e102d5553c75","Type":"ContainerDied","Data":"67daa9bd8128811d30d5710b782098e52f919063831113dfb444335915817615"} Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.514056 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.514267 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" event={"ID":"556e4318-98c6-4910-830a-2edcafa8c5a3","Type":"ContainerDied","Data":"335a472e98db78b509578c161edffb85e2cda67cc112f2084014fb62974f8645"} Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.514341 4764 scope.go:117] "RemoveContainer" containerID="02fabc10f8ca5b240e9914fe91984e0b3ed4f53f2c9e25d78b0f654f639a110c" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.519471 4764 generic.go:334] "Generic (PLEG): container finished" podID="88ba6041-7f8f-48f0-840c-8ea2a9bdc87b" containerID="8cec696202563cb93ec1099b2921beba11271e2ac2498229b87b108935e01fc2" exitCode=0 Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.519570 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d627" 
event={"ID":"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b","Type":"ContainerDied","Data":"8cec696202563cb93ec1099b2921beba11271e2ac2498229b87b108935e01fc2"} Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.522736 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" event={"ID":"e4ef7ddb-4b54-48f7-a879-d07cb3222339","Type":"ContainerDied","Data":"6dfe527522407abed5825087faeed3aa6722bd0b6c51839e5410a49d638c18ec"} Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.522825 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.567865 4764 scope.go:117] "RemoveContainer" containerID="c13a9bf60f9efa3f676a6a2e63469b91bb787e084e0911f7a2d49d61a829e9cf" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.594846 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn8f5" event={"ID":"a76121be-d090-4f2a-9e57-1a160a4bb4f2","Type":"ContainerStarted","Data":"1d40a7985c63e046f744271b4a5346b0a7b6d6cdf8fbc36bc9abda5ba599778a"} Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.626881 4764 patch_prober.go:28] interesting pod/route-controller-manager-7745768cd9-mrqxj container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.626944 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" podUID="556e4318-98c6-4910-830a-2edcafa8c5a3" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": 
net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.655578 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7f6c57f99f-67zw4"] Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.659010 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7f6c57f99f-67zw4"] Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.666875 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj"] Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.666931 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj"] Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.720139 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jn8f5" podStartSLOduration=3.850183257 podStartE2EDuration="58.720115779s" podCreationTimestamp="2026-03-09 13:24:45 +0000 UTC" firstStartedPulling="2026-03-09 13:24:48.145948354 +0000 UTC m=+243.396120262" lastFinishedPulling="2026-03-09 13:25:43.015880876 +0000 UTC m=+298.266052784" observedRunningTime="2026-03-09 13:25:43.712049688 +0000 UTC m=+298.962221596" watchObservedRunningTime="2026-03-09 13:25:43.720115779 +0000 UTC m=+298.970287687" Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.871205 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz"] Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.972070 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c9d566569-xmnfz"] Mar 09 13:25:43 crc kubenswrapper[4764]: W0309 13:25:43.980031 4764 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e222771_a709_459f_a36f_e44f4b87983e.slice/crio-bc74e94f60e5ca1dd91e48b51c639a01e86dbc1382d15f18a1aac3c35e28f4ad WatchSource:0}: Error finding container bc74e94f60e5ca1dd91e48b51c639a01e86dbc1382d15f18a1aac3c35e28f4ad: Status 404 returned error can't find the container with id bc74e94f60e5ca1dd91e48b51c639a01e86dbc1382d15f18a1aac3c35e28f4ad Mar 09 13:25:44 crc kubenswrapper[4764]: I0309 13:25:44.579910 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d9z59" event={"ID":"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2","Type":"ContainerStarted","Data":"b507b889d835011a58e3fb35bdc6611d2459c15d083a937632ed6f78fbcf35b2"} Mar 09 13:25:44 crc kubenswrapper[4764]: I0309 13:25:44.584266 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" event={"ID":"8e222771-a709-459f-a36f-e44f4b87983e","Type":"ContainerStarted","Data":"2fc672eabcdfde188584b8ded74f5c4e2f917baeb322acb79784b8d6c120dfee"} Mar 09 13:25:44 crc kubenswrapper[4764]: I0309 13:25:44.584398 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" event={"ID":"8e222771-a709-459f-a36f-e44f4b87983e","Type":"ContainerStarted","Data":"bc74e94f60e5ca1dd91e48b51c639a01e86dbc1382d15f18a1aac3c35e28f4ad"} Mar 09 13:25:44 crc kubenswrapper[4764]: I0309 13:25:44.585143 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" Mar 09 13:25:44 crc kubenswrapper[4764]: I0309 13:25:44.586499 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz" 
event={"ID":"32312778-9c44-4843-a588-5fba60384e05","Type":"ContainerStarted","Data":"759cbe04f032d8df08a612b05e244889018e14ae649430d1d9c9f8c95485aa5d"} Mar 09 13:25:44 crc kubenswrapper[4764]: I0309 13:25:44.586549 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz" event={"ID":"32312778-9c44-4843-a588-5fba60384e05","Type":"ContainerStarted","Data":"6264edd6e509ddf66049653028a6e5a99b8ff3fab370367781f6b4c2c4544a37"} Mar 09 13:25:44 crc kubenswrapper[4764]: I0309 13:25:44.586770 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz" Mar 09 13:25:44 crc kubenswrapper[4764]: I0309 13:25:44.629281 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" Mar 09 13:25:44 crc kubenswrapper[4764]: I0309 13:25:44.641225 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz" podStartSLOduration=3.641211142 podStartE2EDuration="3.641211142s" podCreationTimestamp="2026-03-09 13:25:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:25:44.639979578 +0000 UTC m=+299.890151486" watchObservedRunningTime="2026-03-09 13:25:44.641211142 +0000 UTC m=+299.891383050" Mar 09 13:25:44 crc kubenswrapper[4764]: I0309 13:25:44.662760 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" podStartSLOduration=3.662692349 podStartE2EDuration="3.662692349s" podCreationTimestamp="2026-03-09 13:25:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 
13:25:44.659866302 +0000 UTC m=+299.910038210" watchObservedRunningTime="2026-03-09 13:25:44.662692349 +0000 UTC m=+299.912864257" Mar 09 13:25:44 crc kubenswrapper[4764]: I0309 13:25:44.918619 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nrc8s" Mar 09 13:25:44 crc kubenswrapper[4764]: I0309 13:25:44.918989 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nrc8s" Mar 09 13:25:44 crc kubenswrapper[4764]: I0309 13:25:44.954938 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz" Mar 09 13:25:45 crc kubenswrapper[4764]: I0309 13:25:45.496748 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jn8f5" Mar 09 13:25:45 crc kubenswrapper[4764]: I0309 13:25:45.496799 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jn8f5" Mar 09 13:25:45 crc kubenswrapper[4764]: I0309 13:25:45.570006 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="556e4318-98c6-4910-830a-2edcafa8c5a3" path="/var/lib/kubelet/pods/556e4318-98c6-4910-830a-2edcafa8c5a3/volumes" Mar 09 13:25:45 crc kubenswrapper[4764]: I0309 13:25:45.570571 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4ef7ddb-4b54-48f7-a879-d07cb3222339" path="/var/lib/kubelet/pods/e4ef7ddb-4b54-48f7-a879-d07cb3222339/volumes" Mar 09 13:25:45 crc kubenswrapper[4764]: I0309 13:25:45.598981 4764 generic.go:334] "Generic (PLEG): container finished" podID="41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6" containerID="0ae328f899ade57662b7a57d61c5864e374b6610f4676d30b76a8c7048f7853c" exitCode=0 Mar 09 13:25:45 crc kubenswrapper[4764]: I0309 13:25:45.599819 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-tll5t" event={"ID":"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6","Type":"ContainerDied","Data":"0ae328f899ade57662b7a57d61c5864e374b6610f4676d30b76a8c7048f7853c"} Mar 09 13:25:45 crc kubenswrapper[4764]: I0309 13:25:45.960543 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nrc8s" Mar 09 13:25:46 crc kubenswrapper[4764]: I0309 13:25:46.609838 4764 generic.go:334] "Generic (PLEG): container finished" podID="c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2" containerID="b507b889d835011a58e3fb35bdc6611d2459c15d083a937632ed6f78fbcf35b2" exitCode=0 Mar 09 13:25:46 crc kubenswrapper[4764]: I0309 13:25:46.609938 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d9z59" event={"ID":"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2","Type":"ContainerDied","Data":"b507b889d835011a58e3fb35bdc6611d2459c15d083a937632ed6f78fbcf35b2"} Mar 09 13:25:46 crc kubenswrapper[4764]: I0309 13:25:46.833787 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-jn8f5" podUID="a76121be-d090-4f2a-9e57-1a160a4bb4f2" containerName="registry-server" probeResult="failure" output=< Mar 09 13:25:46 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Mar 09 13:25:46 crc kubenswrapper[4764]: > Mar 09 13:25:49 crc kubenswrapper[4764]: I0309 13:25:49.232386 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-x927s" Mar 09 13:25:49 crc kubenswrapper[4764]: I0309 13:25:49.630527 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d627" event={"ID":"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b","Type":"ContainerStarted","Data":"5dafc18a60471b2aa074fa19f2a0f2019626c0dc18f4738cc76636c111e1e0e1"} Mar 09 13:25:49 crc kubenswrapper[4764]: I0309 13:25:49.633505 4764 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-qhs57" event={"ID":"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df","Type":"ContainerStarted","Data":"aca0e54cf991318058295c5dea65c4ff3d25ff689a99973580ba7e63019a2803"} Mar 09 13:25:49 crc kubenswrapper[4764]: I0309 13:25:49.636883 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d9z59" event={"ID":"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2","Type":"ContainerStarted","Data":"ffc2a9491a6276ce817db9e39e528e75cd29f3eedc77f220c7dae09ad9a1c62b"} Mar 09 13:25:49 crc kubenswrapper[4764]: I0309 13:25:49.639800 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tll5t" event={"ID":"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6","Type":"ContainerStarted","Data":"4325cf9007df54fa3b2a5bed7103ccce2df4b9e7e47622fe04491c8fac8d1503"} Mar 09 13:25:49 crc kubenswrapper[4764]: I0309 13:25:49.642970 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbm5b" event={"ID":"20acdcb5-ea78-435e-b472-e102d5553c75","Type":"ContainerStarted","Data":"b884af634d32c12476d7f597eeeb93dfd054afb59803fed8ff12daa9129069bc"} Mar 09 13:25:49 crc kubenswrapper[4764]: I0309 13:25:49.644959 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551044-p748f" event={"ID":"0a005f65-920a-4cdd-b4da-a270953113aa","Type":"ContainerStarted","Data":"3ce66a9ae238a55280ff899673763223d028dea40122d545386701984ba576ae"} Mar 09 13:25:49 crc kubenswrapper[4764]: I0309 13:25:49.646985 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7k9k" event={"ID":"7a967c79-e11e-4c58-b42e-652d1406ac88","Type":"ContainerStarted","Data":"5ed7c7fd6e4e524d04d9ab3277bdad76d1f17e39f4ce65077628067a3a616861"} Mar 09 13:25:49 crc kubenswrapper[4764]: I0309 13:25:49.656459 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-8d627" podStartSLOduration=3.61726369 podStartE2EDuration="1m5.656439049s" podCreationTimestamp="2026-03-09 13:24:44 +0000 UTC" firstStartedPulling="2026-03-09 13:24:47.022442687 +0000 UTC m=+242.272614595" lastFinishedPulling="2026-03-09 13:25:49.061618026 +0000 UTC m=+304.311789954" observedRunningTime="2026-03-09 13:25:49.654048233 +0000 UTC m=+304.904220131" watchObservedRunningTime="2026-03-09 13:25:49.656439049 +0000 UTC m=+304.906610957" Mar 09 13:25:49 crc kubenswrapper[4764]: I0309 13:25:49.674847 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d9z59" podStartSLOduration=3.335646016 podStartE2EDuration="1m1.674824261s" podCreationTimestamp="2026-03-09 13:24:48 +0000 UTC" firstStartedPulling="2026-03-09 13:24:50.842840003 +0000 UTC m=+246.093011911" lastFinishedPulling="2026-03-09 13:25:49.182018248 +0000 UTC m=+304.432190156" observedRunningTime="2026-03-09 13:25:49.673367121 +0000 UTC m=+304.923539039" watchObservedRunningTime="2026-03-09 13:25:49.674824261 +0000 UTC m=+304.924996169" Mar 09 13:25:49 crc kubenswrapper[4764]: I0309 13:25:49.690018 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551044-p748f" podStartSLOduration=40.290774396 podStartE2EDuration="1m49.690002196s" podCreationTimestamp="2026-03-09 13:24:00 +0000 UTC" firstStartedPulling="2026-03-09 13:24:39.65230508 +0000 UTC m=+234.902476988" lastFinishedPulling="2026-03-09 13:25:49.05153288 +0000 UTC m=+304.301704788" observedRunningTime="2026-03-09 13:25:49.686507101 +0000 UTC m=+304.936679009" watchObservedRunningTime="2026-03-09 13:25:49.690002196 +0000 UTC m=+304.940174104" Mar 09 13:25:49 crc kubenswrapper[4764]: I0309 13:25:49.713742 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mbm5b" podStartSLOduration=9.516713179 
podStartE2EDuration="1m5.713722505s" podCreationTimestamp="2026-03-09 13:24:44 +0000 UTC" firstStartedPulling="2026-03-09 13:24:48.395613897 +0000 UTC m=+243.645785805" lastFinishedPulling="2026-03-09 13:25:44.592623223 +0000 UTC m=+299.842795131" observedRunningTime="2026-03-09 13:25:49.708566444 +0000 UTC m=+304.958738342" watchObservedRunningTime="2026-03-09 13:25:49.713722505 +0000 UTC m=+304.963894433" Mar 09 13:25:49 crc kubenswrapper[4764]: I0309 13:25:49.741084 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tll5t" podStartSLOduration=4.371014844 podStartE2EDuration="1m2.741057002s" podCreationTimestamp="2026-03-09 13:24:47 +0000 UTC" firstStartedPulling="2026-03-09 13:24:50.842313299 +0000 UTC m=+246.092485207" lastFinishedPulling="2026-03-09 13:25:49.212355457 +0000 UTC m=+304.462527365" observedRunningTime="2026-03-09 13:25:49.739178981 +0000 UTC m=+304.989350889" watchObservedRunningTime="2026-03-09 13:25:49.741057002 +0000 UTC m=+304.991228910" Mar 09 13:25:50 crc kubenswrapper[4764]: I0309 13:25:50.226280 4764 csr.go:261] certificate signing request csr-dbwk4 is approved, waiting to be issued Mar 09 13:25:50 crc kubenswrapper[4764]: I0309 13:25:50.234781 4764 csr.go:257] certificate signing request csr-dbwk4 is issued Mar 09 13:25:50 crc kubenswrapper[4764]: I0309 13:25:50.655029 4764 generic.go:334] "Generic (PLEG): container finished" podID="0a005f65-920a-4cdd-b4da-a270953113aa" containerID="3ce66a9ae238a55280ff899673763223d028dea40122d545386701984ba576ae" exitCode=0 Mar 09 13:25:50 crc kubenswrapper[4764]: I0309 13:25:50.655117 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551044-p748f" event={"ID":"0a005f65-920a-4cdd-b4da-a270953113aa","Type":"ContainerDied","Data":"3ce66a9ae238a55280ff899673763223d028dea40122d545386701984ba576ae"} Mar 09 13:25:50 crc kubenswrapper[4764]: I0309 13:25:50.657388 4764 generic.go:334] "Generic (PLEG): 
container finished" podID="7a967c79-e11e-4c58-b42e-652d1406ac88" containerID="5ed7c7fd6e4e524d04d9ab3277bdad76d1f17e39f4ce65077628067a3a616861" exitCode=0 Mar 09 13:25:50 crc kubenswrapper[4764]: I0309 13:25:50.657441 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7k9k" event={"ID":"7a967c79-e11e-4c58-b42e-652d1406ac88","Type":"ContainerDied","Data":"5ed7c7fd6e4e524d04d9ab3277bdad76d1f17e39f4ce65077628067a3a616861"} Mar 09 13:25:50 crc kubenswrapper[4764]: I0309 13:25:50.659513 4764 generic.go:334] "Generic (PLEG): container finished" podID="691ffa6f-3ee6-47fa-bcef-9fdd74ac86df" containerID="aca0e54cf991318058295c5dea65c4ff3d25ff689a99973580ba7e63019a2803" exitCode=0 Mar 09 13:25:50 crc kubenswrapper[4764]: I0309 13:25:50.659550 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qhs57" event={"ID":"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df","Type":"ContainerDied","Data":"aca0e54cf991318058295c5dea65c4ff3d25ff689a99973580ba7e63019a2803"} Mar 09 13:25:51 crc kubenswrapper[4764]: I0309 13:25:51.237461 4764 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-24 09:03:09.417637625 +0000 UTC Mar 09 13:25:51 crc kubenswrapper[4764]: I0309 13:25:51.237518 4764 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6235h37m18.18012371s for next certificate rotation Mar 09 13:25:52 crc kubenswrapper[4764]: I0309 13:25:52.077027 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551044-p748f" Mar 09 13:25:52 crc kubenswrapper[4764]: I0309 13:25:52.200638 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g47h8\" (UniqueName: \"kubernetes.io/projected/0a005f65-920a-4cdd-b4da-a270953113aa-kube-api-access-g47h8\") pod \"0a005f65-920a-4cdd-b4da-a270953113aa\" (UID: \"0a005f65-920a-4cdd-b4da-a270953113aa\") " Mar 09 13:25:52 crc kubenswrapper[4764]: I0309 13:25:52.207914 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a005f65-920a-4cdd-b4da-a270953113aa-kube-api-access-g47h8" (OuterVolumeSpecName: "kube-api-access-g47h8") pod "0a005f65-920a-4cdd-b4da-a270953113aa" (UID: "0a005f65-920a-4cdd-b4da-a270953113aa"). InnerVolumeSpecName "kube-api-access-g47h8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:25:52 crc kubenswrapper[4764]: I0309 13:25:52.238176 4764 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-18 06:20:17.031616075 +0000 UTC Mar 09 13:25:52 crc kubenswrapper[4764]: I0309 13:25:52.238225 4764 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7552h54m24.793393641s for next certificate rotation Mar 09 13:25:52 crc kubenswrapper[4764]: I0309 13:25:52.302440 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g47h8\" (UniqueName: \"kubernetes.io/projected/0a005f65-920a-4cdd-b4da-a270953113aa-kube-api-access-g47h8\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:52 crc kubenswrapper[4764]: I0309 13:25:52.674429 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551044-p748f" event={"ID":"0a005f65-920a-4cdd-b4da-a270953113aa","Type":"ContainerDied","Data":"0090a60e7aca5b3b5065eab933849dd34829a2b082555fb9f3ff31c4a933640e"} Mar 09 13:25:52 crc kubenswrapper[4764]: I0309 
13:25:52.674479 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0090a60e7aca5b3b5065eab933849dd34829a2b082555fb9f3ff31c4a933640e" Mar 09 13:25:52 crc kubenswrapper[4764]: I0309 13:25:52.674535 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551044-p748f" Mar 09 13:25:54 crc kubenswrapper[4764]: I0309 13:25:54.702149 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7k9k" event={"ID":"7a967c79-e11e-4c58-b42e-652d1406ac88","Type":"ContainerStarted","Data":"018f9058cd176b4bd85208014309fc2aaec24cd432697bbe4160e6f16524159a"} Mar 09 13:25:54 crc kubenswrapper[4764]: I0309 13:25:54.726600 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-g7k9k" podStartSLOduration=3.546473055 podStartE2EDuration="1m7.726584806s" podCreationTimestamp="2026-03-09 13:24:47 +0000 UTC" firstStartedPulling="2026-03-09 13:24:49.552451749 +0000 UTC m=+244.802623657" lastFinishedPulling="2026-03-09 13:25:53.7325635 +0000 UTC m=+308.982735408" observedRunningTime="2026-03-09 13:25:54.724565411 +0000 UTC m=+309.974737339" watchObservedRunningTime="2026-03-09 13:25:54.726584806 +0000 UTC m=+309.976756714" Mar 09 13:25:54 crc kubenswrapper[4764]: I0309 13:25:54.980665 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nrc8s" Mar 09 13:25:55 crc kubenswrapper[4764]: I0309 13:25:55.047947 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:25:55 crc kubenswrapper[4764]: I0309 13:25:55.048174 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:25:55 crc kubenswrapper[4764]: I0309 13:25:55.112087 4764 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:25:55 crc kubenswrapper[4764]: I0309 13:25:55.276747 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:25:55 crc kubenswrapper[4764]: I0309 13:25:55.276857 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:25:55 crc kubenswrapper[4764]: I0309 13:25:55.331864 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:25:55 crc kubenswrapper[4764]: I0309 13:25:55.551792 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jn8f5" Mar 09 13:25:55 crc kubenswrapper[4764]: I0309 13:25:55.596995 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jn8f5" Mar 09 13:25:55 crc kubenswrapper[4764]: I0309 13:25:55.709037 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qhs57" event={"ID":"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df","Type":"ContainerStarted","Data":"d02322e51b07c573018bc35bb85c4b069d5e04cb23ef61de23f558be13356214"} Mar 09 13:25:55 crc kubenswrapper[4764]: I0309 13:25:55.754337 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:25:55 crc kubenswrapper[4764]: I0309 13:25:55.761241 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:25:55 crc kubenswrapper[4764]: I0309 13:25:55.780355 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qhs57" podStartSLOduration=3.800128888 
podStartE2EDuration="1m9.780338706s" podCreationTimestamp="2026-03-09 13:24:46 +0000 UTC" firstStartedPulling="2026-03-09 13:24:48.460555203 +0000 UTC m=+243.710727111" lastFinishedPulling="2026-03-09 13:25:54.440765021 +0000 UTC m=+309.690936929" observedRunningTime="2026-03-09 13:25:55.728870199 +0000 UTC m=+310.979042117" watchObservedRunningTime="2026-03-09 13:25:55.780338706 +0000 UTC m=+311.030510614" Mar 09 13:25:57 crc kubenswrapper[4764]: I0309 13:25:57.157908 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:25:57 crc kubenswrapper[4764]: I0309 13:25:57.157987 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:25:57 crc kubenswrapper[4764]: I0309 13:25:57.211825 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:25:57 crc kubenswrapper[4764]: I0309 13:25:57.258853 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mbm5b"] Mar 09 13:25:57 crc kubenswrapper[4764]: I0309 13:25:57.603373 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-g7k9k" Mar 09 13:25:57 crc kubenswrapper[4764]: I0309 13:25:57.603442 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-g7k9k" Mar 09 13:25:57 crc kubenswrapper[4764]: I0309 13:25:57.655215 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-g7k9k" Mar 09 13:25:57 crc kubenswrapper[4764]: I0309 13:25:57.849044 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nj856"] Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.258592 4764 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/certified-operators-jn8f5"] Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.259082 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jn8f5" podUID="a76121be-d090-4f2a-9e57-1a160a4bb4f2" containerName="registry-server" containerID="cri-o://1d40a7985c63e046f744271b4a5346b0a7b6d6cdf8fbc36bc9abda5ba599778a" gracePeriod=2 Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.369917 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.369984 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.370039 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.370674 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad"} pod="openshift-machine-config-operator/machine-config-daemon-xxczl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.370732 4764 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" containerID="cri-o://a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad" gracePeriod=600 Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.556559 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.607088 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.617992 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.731901 4764 generic.go:334] "Generic (PLEG): container finished" podID="6bcdd179-43c2-427c-9fac-7155c122e922" containerID="a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad" exitCode=0 Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.731983 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerDied","Data":"a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad"} Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.732019 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"8bb39ae24112881ead01c36905f25d69cb36d1e4d3c6e8aa79f283e7dd0d444b"} Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.739274 4764 generic.go:334] "Generic (PLEG): container finished" podID="a76121be-d090-4f2a-9e57-1a160a4bb4f2" containerID="1d40a7985c63e046f744271b4a5346b0a7b6d6cdf8fbc36bc9abda5ba599778a" 
exitCode=0 Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.739301 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn8f5" event={"ID":"a76121be-d090-4f2a-9e57-1a160a4bb4f2","Type":"ContainerDied","Data":"1d40a7985c63e046f744271b4a5346b0a7b6d6cdf8fbc36bc9abda5ba599778a"} Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.739771 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mbm5b" podUID="20acdcb5-ea78-435e-b472-e102d5553c75" containerName="registry-server" containerID="cri-o://b884af634d32c12476d7f597eeeb93dfd054afb59803fed8ff12daa9129069bc" gracePeriod=2 Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.764225 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.765247 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.822374 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.838189 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.876113 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jn8f5" Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.889847 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc5m2\" (UniqueName: \"kubernetes.io/projected/a76121be-d090-4f2a-9e57-1a160a4bb4f2-kube-api-access-dc5m2\") pod \"a76121be-d090-4f2a-9e57-1a160a4bb4f2\" (UID: \"a76121be-d090-4f2a-9e57-1a160a4bb4f2\") " Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.889906 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a76121be-d090-4f2a-9e57-1a160a4bb4f2-utilities\") pod \"a76121be-d090-4f2a-9e57-1a160a4bb4f2\" (UID: \"a76121be-d090-4f2a-9e57-1a160a4bb4f2\") " Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.889944 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a76121be-d090-4f2a-9e57-1a160a4bb4f2-catalog-content\") pod \"a76121be-d090-4f2a-9e57-1a160a4bb4f2\" (UID: \"a76121be-d090-4f2a-9e57-1a160a4bb4f2\") " Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.897404 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a76121be-d090-4f2a-9e57-1a160a4bb4f2-utilities" (OuterVolumeSpecName: "utilities") pod "a76121be-d090-4f2a-9e57-1a160a4bb4f2" (UID: "a76121be-d090-4f2a-9e57-1a160a4bb4f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.900729 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a76121be-d090-4f2a-9e57-1a160a4bb4f2-kube-api-access-dc5m2" (OuterVolumeSpecName: "kube-api-access-dc5m2") pod "a76121be-d090-4f2a-9e57-1a160a4bb4f2" (UID: "a76121be-d090-4f2a-9e57-1a160a4bb4f2"). InnerVolumeSpecName "kube-api-access-dc5m2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.954303 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a76121be-d090-4f2a-9e57-1a160a4bb4f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a76121be-d090-4f2a-9e57-1a160a4bb4f2" (UID: "a76121be-d090-4f2a-9e57-1a160a4bb4f2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.991469 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dc5m2\" (UniqueName: \"kubernetes.io/projected/a76121be-d090-4f2a-9e57-1a160a4bb4f2-kube-api-access-dc5m2\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.991756 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a76121be-d090-4f2a-9e57-1a160a4bb4f2-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.991770 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a76121be-d090-4f2a-9e57-1a160a4bb4f2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.179234 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.193552 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20acdcb5-ea78-435e-b472-e102d5553c75-utilities\") pod \"20acdcb5-ea78-435e-b472-e102d5553c75\" (UID: \"20acdcb5-ea78-435e-b472-e102d5553c75\") " Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.193623 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4s9m\" (UniqueName: \"kubernetes.io/projected/20acdcb5-ea78-435e-b472-e102d5553c75-kube-api-access-f4s9m\") pod \"20acdcb5-ea78-435e-b472-e102d5553c75\" (UID: \"20acdcb5-ea78-435e-b472-e102d5553c75\") " Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.193658 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20acdcb5-ea78-435e-b472-e102d5553c75-catalog-content\") pod \"20acdcb5-ea78-435e-b472-e102d5553c75\" (UID: \"20acdcb5-ea78-435e-b472-e102d5553c75\") " Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.194376 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20acdcb5-ea78-435e-b472-e102d5553c75-utilities" (OuterVolumeSpecName: "utilities") pod "20acdcb5-ea78-435e-b472-e102d5553c75" (UID: "20acdcb5-ea78-435e-b472-e102d5553c75"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.198882 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20acdcb5-ea78-435e-b472-e102d5553c75-kube-api-access-f4s9m" (OuterVolumeSpecName: "kube-api-access-f4s9m") pod "20acdcb5-ea78-435e-b472-e102d5553c75" (UID: "20acdcb5-ea78-435e-b472-e102d5553c75"). InnerVolumeSpecName "kube-api-access-f4s9m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.262132 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20acdcb5-ea78-435e-b472-e102d5553c75-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "20acdcb5-ea78-435e-b472-e102d5553c75" (UID: "20acdcb5-ea78-435e-b472-e102d5553c75"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.295340 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20acdcb5-ea78-435e-b472-e102d5553c75-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.295384 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4s9m\" (UniqueName: \"kubernetes.io/projected/20acdcb5-ea78-435e-b472-e102d5553c75-kube-api-access-f4s9m\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.295395 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20acdcb5-ea78-435e-b472-e102d5553c75-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.749802 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jn8f5" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.749832 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn8f5" event={"ID":"a76121be-d090-4f2a-9e57-1a160a4bb4f2","Type":"ContainerDied","Data":"a07c170a29ea8bcf9be266201f1dd0580d7bdb690c3b989b62809138bb677d6e"} Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.750258 4764 scope.go:117] "RemoveContainer" containerID="1d40a7985c63e046f744271b4a5346b0a7b6d6cdf8fbc36bc9abda5ba599778a" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.754541 4764 generic.go:334] "Generic (PLEG): container finished" podID="20acdcb5-ea78-435e-b472-e102d5553c75" containerID="b884af634d32c12476d7f597eeeb93dfd054afb59803fed8ff12daa9129069bc" exitCode=0 Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.754626 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.754695 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbm5b" event={"ID":"20acdcb5-ea78-435e-b472-e102d5553c75","Type":"ContainerDied","Data":"b884af634d32c12476d7f597eeeb93dfd054afb59803fed8ff12daa9129069bc"} Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.754761 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbm5b" event={"ID":"20acdcb5-ea78-435e-b472-e102d5553c75","Type":"ContainerDied","Data":"36b8a908fc96eec5fd19468146038ec7f847f96484b3a606a41defe1a23a894e"} Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.773238 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jn8f5"] Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.773419 4764 scope.go:117] "RemoveContainer" 
containerID="71eda28b996e54f946f3b59d46c82f8b6fc0575bacbdbf501c3dc23dbb439d43" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.785431 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jn8f5"] Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.797629 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mbm5b"] Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.800356 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mbm5b"] Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.804002 4764 scope.go:117] "RemoveContainer" containerID="851e5f42b5e692a4d7bb4a0fda84945ae1aa93c4dc2838b29856d0c9ed98624f" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.819109 4764 scope.go:117] "RemoveContainer" containerID="b884af634d32c12476d7f597eeeb93dfd054afb59803fed8ff12daa9129069bc" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.823806 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.836779 4764 scope.go:117] "RemoveContainer" containerID="67daa9bd8128811d30d5710b782098e52f919063831113dfb444335915817615" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.856124 4764 scope.go:117] "RemoveContainer" containerID="8e85cc77a9a235fb4dc23dc9451690c29cdcf977870a98d1bc077bac0fa992d1" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.869759 4764 scope.go:117] "RemoveContainer" containerID="b884af634d32c12476d7f597eeeb93dfd054afb59803fed8ff12daa9129069bc" Mar 09 13:25:59 crc kubenswrapper[4764]: E0309 13:25:59.870022 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b884af634d32c12476d7f597eeeb93dfd054afb59803fed8ff12daa9129069bc\": container with ID starting with 
b884af634d32c12476d7f597eeeb93dfd054afb59803fed8ff12daa9129069bc not found: ID does not exist" containerID="b884af634d32c12476d7f597eeeb93dfd054afb59803fed8ff12daa9129069bc" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.870056 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b884af634d32c12476d7f597eeeb93dfd054afb59803fed8ff12daa9129069bc"} err="failed to get container status \"b884af634d32c12476d7f597eeeb93dfd054afb59803fed8ff12daa9129069bc\": rpc error: code = NotFound desc = could not find container \"b884af634d32c12476d7f597eeeb93dfd054afb59803fed8ff12daa9129069bc\": container with ID starting with b884af634d32c12476d7f597eeeb93dfd054afb59803fed8ff12daa9129069bc not found: ID does not exist" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.870081 4764 scope.go:117] "RemoveContainer" containerID="67daa9bd8128811d30d5710b782098e52f919063831113dfb444335915817615" Mar 09 13:25:59 crc kubenswrapper[4764]: E0309 13:25:59.877155 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67daa9bd8128811d30d5710b782098e52f919063831113dfb444335915817615\": container with ID starting with 67daa9bd8128811d30d5710b782098e52f919063831113dfb444335915817615 not found: ID does not exist" containerID="67daa9bd8128811d30d5710b782098e52f919063831113dfb444335915817615" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.877960 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67daa9bd8128811d30d5710b782098e52f919063831113dfb444335915817615"} err="failed to get container status \"67daa9bd8128811d30d5710b782098e52f919063831113dfb444335915817615\": rpc error: code = NotFound desc = could not find container \"67daa9bd8128811d30d5710b782098e52f919063831113dfb444335915817615\": container with ID starting with 67daa9bd8128811d30d5710b782098e52f919063831113dfb444335915817615 not found: ID does not 
exist" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.878083 4764 scope.go:117] "RemoveContainer" containerID="8e85cc77a9a235fb4dc23dc9451690c29cdcf977870a98d1bc077bac0fa992d1" Mar 09 13:25:59 crc kubenswrapper[4764]: E0309 13:25:59.878622 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e85cc77a9a235fb4dc23dc9451690c29cdcf977870a98d1bc077bac0fa992d1\": container with ID starting with 8e85cc77a9a235fb4dc23dc9451690c29cdcf977870a98d1bc077bac0fa992d1 not found: ID does not exist" containerID="8e85cc77a9a235fb4dc23dc9451690c29cdcf977870a98d1bc077bac0fa992d1" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.878668 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e85cc77a9a235fb4dc23dc9451690c29cdcf977870a98d1bc077bac0fa992d1"} err="failed to get container status \"8e85cc77a9a235fb4dc23dc9451690c29cdcf977870a98d1bc077bac0fa992d1\": rpc error: code = NotFound desc = could not find container \"8e85cc77a9a235fb4dc23dc9451690c29cdcf977870a98d1bc077bac0fa992d1\": container with ID starting with 8e85cc77a9a235fb4dc23dc9451690c29cdcf977870a98d1bc077bac0fa992d1 not found: ID does not exist" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.139009 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551046-ll87d"] Mar 09 13:26:00 crc kubenswrapper[4764]: E0309 13:26:00.139486 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20acdcb5-ea78-435e-b472-e102d5553c75" containerName="extract-content" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.139563 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="20acdcb5-ea78-435e-b472-e102d5553c75" containerName="extract-content" Mar 09 13:26:00 crc kubenswrapper[4764]: E0309 13:26:00.139628 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a76121be-d090-4f2a-9e57-1a160a4bb4f2" 
containerName="extract-utilities" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.139719 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a76121be-d090-4f2a-9e57-1a160a4bb4f2" containerName="extract-utilities" Mar 09 13:26:00 crc kubenswrapper[4764]: E0309 13:26:00.139791 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a76121be-d090-4f2a-9e57-1a160a4bb4f2" containerName="registry-server" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.139846 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a76121be-d090-4f2a-9e57-1a160a4bb4f2" containerName="registry-server" Mar 09 13:26:00 crc kubenswrapper[4764]: E0309 13:26:00.139899 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20acdcb5-ea78-435e-b472-e102d5553c75" containerName="registry-server" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.140038 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="20acdcb5-ea78-435e-b472-e102d5553c75" containerName="registry-server" Mar 09 13:26:00 crc kubenswrapper[4764]: E0309 13:26:00.140103 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20acdcb5-ea78-435e-b472-e102d5553c75" containerName="extract-utilities" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.140155 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="20acdcb5-ea78-435e-b472-e102d5553c75" containerName="extract-utilities" Mar 09 13:26:00 crc kubenswrapper[4764]: E0309 13:26:00.140208 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a76121be-d090-4f2a-9e57-1a160a4bb4f2" containerName="extract-content" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.140258 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a76121be-d090-4f2a-9e57-1a160a4bb4f2" containerName="extract-content" Mar 09 13:26:00 crc kubenswrapper[4764]: E0309 13:26:00.140314 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a005f65-920a-4cdd-b4da-a270953113aa" 
containerName="oc" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.140375 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a005f65-920a-4cdd-b4da-a270953113aa" containerName="oc" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.140538 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="20acdcb5-ea78-435e-b472-e102d5553c75" containerName="registry-server" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.140597 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a005f65-920a-4cdd-b4da-a270953113aa" containerName="oc" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.140683 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a76121be-d090-4f2a-9e57-1a160a4bb4f2" containerName="registry-server" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.141153 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551046-ll87d" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.143455 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.143695 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.144914 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.148197 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551046-ll87d"] Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.208879 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msj62\" (UniqueName: \"kubernetes.io/projected/c09230f9-b117-44a0-b3ed-ab6dc7ce0285-kube-api-access-msj62\") pod 
\"auto-csr-approver-29551046-ll87d\" (UID: \"c09230f9-b117-44a0-b3ed-ab6dc7ce0285\") " pod="openshift-infra/auto-csr-approver-29551046-ll87d" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.310525 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msj62\" (UniqueName: \"kubernetes.io/projected/c09230f9-b117-44a0-b3ed-ab6dc7ce0285-kube-api-access-msj62\") pod \"auto-csr-approver-29551046-ll87d\" (UID: \"c09230f9-b117-44a0-b3ed-ab6dc7ce0285\") " pod="openshift-infra/auto-csr-approver-29551046-ll87d" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.345855 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msj62\" (UniqueName: \"kubernetes.io/projected/c09230f9-b117-44a0-b3ed-ab6dc7ce0285-kube-api-access-msj62\") pod \"auto-csr-approver-29551046-ll87d\" (UID: \"c09230f9-b117-44a0-b3ed-ab6dc7ce0285\") " pod="openshift-infra/auto-csr-approver-29551046-ll87d" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.460189 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551046-ll87d" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.869341 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551046-ll87d"] Mar 09 13:26:01 crc kubenswrapper[4764]: I0309 13:26:01.569151 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20acdcb5-ea78-435e-b472-e102d5553c75" path="/var/lib/kubelet/pods/20acdcb5-ea78-435e-b472-e102d5553c75/volumes" Mar 09 13:26:01 crc kubenswrapper[4764]: I0309 13:26:01.569945 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a76121be-d090-4f2a-9e57-1a160a4bb4f2" path="/var/lib/kubelet/pods/a76121be-d090-4f2a-9e57-1a160a4bb4f2/volumes" Mar 09 13:26:01 crc kubenswrapper[4764]: I0309 13:26:01.769813 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551046-ll87d" event={"ID":"c09230f9-b117-44a0-b3ed-ab6dc7ce0285","Type":"ContainerStarted","Data":"f8624d20b131f960061486b5cf6a38954f017f1461bb66469047b83c408d5f71"} Mar 09 13:26:01 crc kubenswrapper[4764]: I0309 13:26:01.782946 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7c9d566569-xmnfz"] Mar 09 13:26:01 crc kubenswrapper[4764]: I0309 13:26:01.783223 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" podUID="8e222771-a709-459f-a36f-e44f4b87983e" containerName="controller-manager" containerID="cri-o://2fc672eabcdfde188584b8ded74f5c4e2f917baeb322acb79784b8d6c120dfee" gracePeriod=30 Mar 09 13:26:01 crc kubenswrapper[4764]: I0309 13:26:01.888219 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz"] Mar 09 13:26:01 crc kubenswrapper[4764]: I0309 13:26:01.888494 4764 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz" podUID="32312778-9c44-4843-a588-5fba60384e05" containerName="route-controller-manager" containerID="cri-o://759cbe04f032d8df08a612b05e244889018e14ae649430d1d9c9f8c95485aa5d" gracePeriod=30 Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.397766 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.401773 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.439084 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-client-ca\") pod \"8e222771-a709-459f-a36f-e44f4b87983e\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.439134 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32312778-9c44-4843-a588-5fba60384e05-serving-cert\") pod \"32312778-9c44-4843-a588-5fba60384e05\" (UID: \"32312778-9c44-4843-a588-5fba60384e05\") " Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.439189 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l6gj\" (UniqueName: \"kubernetes.io/projected/32312778-9c44-4843-a588-5fba60384e05-kube-api-access-4l6gj\") pod \"32312778-9c44-4843-a588-5fba60384e05\" (UID: \"32312778-9c44-4843-a588-5fba60384e05\") " Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.439219 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/32312778-9c44-4843-a588-5fba60384e05-config\") pod \"32312778-9c44-4843-a588-5fba60384e05\" (UID: \"32312778-9c44-4843-a588-5fba60384e05\") " Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.439242 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-proxy-ca-bundles\") pod \"8e222771-a709-459f-a36f-e44f4b87983e\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.439266 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e222771-a709-459f-a36f-e44f4b87983e-serving-cert\") pod \"8e222771-a709-459f-a36f-e44f4b87983e\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.439290 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32312778-9c44-4843-a588-5fba60384e05-client-ca\") pod \"32312778-9c44-4843-a588-5fba60384e05\" (UID: \"32312778-9c44-4843-a588-5fba60384e05\") " Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.439319 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bf7k\" (UniqueName: \"kubernetes.io/projected/8e222771-a709-459f-a36f-e44f4b87983e-kube-api-access-7bf7k\") pod \"8e222771-a709-459f-a36f-e44f4b87983e\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.439359 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-config\") pod \"8e222771-a709-459f-a36f-e44f4b87983e\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.440005 4764 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8e222771-a709-459f-a36f-e44f4b87983e" (UID: "8e222771-a709-459f-a36f-e44f4b87983e"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.440009 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32312778-9c44-4843-a588-5fba60384e05-config" (OuterVolumeSpecName: "config") pod "32312778-9c44-4843-a588-5fba60384e05" (UID: "32312778-9c44-4843-a588-5fba60384e05"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.440173 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32312778-9c44-4843-a588-5fba60384e05-client-ca" (OuterVolumeSpecName: "client-ca") pod "32312778-9c44-4843-a588-5fba60384e05" (UID: "32312778-9c44-4843-a588-5fba60384e05"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.440245 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-client-ca" (OuterVolumeSpecName: "client-ca") pod "8e222771-a709-459f-a36f-e44f4b87983e" (UID: "8e222771-a709-459f-a36f-e44f4b87983e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.441557 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-config" (OuterVolumeSpecName: "config") pod "8e222771-a709-459f-a36f-e44f4b87983e" (UID: "8e222771-a709-459f-a36f-e44f4b87983e"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.448884 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e222771-a709-459f-a36f-e44f4b87983e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8e222771-a709-459f-a36f-e44f4b87983e" (UID: "8e222771-a709-459f-a36f-e44f4b87983e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.449779 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32312778-9c44-4843-a588-5fba60384e05-kube-api-access-4l6gj" (OuterVolumeSpecName: "kube-api-access-4l6gj") pod "32312778-9c44-4843-a588-5fba60384e05" (UID: "32312778-9c44-4843-a588-5fba60384e05"). InnerVolumeSpecName "kube-api-access-4l6gj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.451788 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32312778-9c44-4843-a588-5fba60384e05-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "32312778-9c44-4843-a588-5fba60384e05" (UID: "32312778-9c44-4843-a588-5fba60384e05"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.451820 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e222771-a709-459f-a36f-e44f4b87983e-kube-api-access-7bf7k" (OuterVolumeSpecName: "kube-api-access-7bf7k") pod "8e222771-a709-459f-a36f-e44f4b87983e" (UID: "8e222771-a709-459f-a36f-e44f4b87983e"). InnerVolumeSpecName "kube-api-access-7bf7k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.540122 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32312778-9c44-4843-a588-5fba60384e05-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.540156 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bf7k\" (UniqueName: \"kubernetes.io/projected/8e222771-a709-459f-a36f-e44f4b87983e-kube-api-access-7bf7k\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.540168 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.540179 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.540191 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32312778-9c44-4843-a588-5fba60384e05-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.540200 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4l6gj\" (UniqueName: \"kubernetes.io/projected/32312778-9c44-4843-a588-5fba60384e05-kube-api-access-4l6gj\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.540210 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32312778-9c44-4843-a588-5fba60384e05-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.540220 4764 reconciler_common.go:293] 
"Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.540230 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e222771-a709-459f-a36f-e44f4b87983e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.660236 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d9z59"] Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.776973 4764 generic.go:334] "Generic (PLEG): container finished" podID="c09230f9-b117-44a0-b3ed-ab6dc7ce0285" containerID="bde3db10cbf6b95804c8c69e0b70c34f476ab92c998e4a7ae6079e0e0e9d7b98" exitCode=0 Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.777089 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551046-ll87d" event={"ID":"c09230f9-b117-44a0-b3ed-ab6dc7ce0285","Type":"ContainerDied","Data":"bde3db10cbf6b95804c8c69e0b70c34f476ab92c998e4a7ae6079e0e0e9d7b98"} Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.778999 4764 generic.go:334] "Generic (PLEG): container finished" podID="8e222771-a709-459f-a36f-e44f4b87983e" containerID="2fc672eabcdfde188584b8ded74f5c4e2f917baeb322acb79784b8d6c120dfee" exitCode=0 Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.779052 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" event={"ID":"8e222771-a709-459f-a36f-e44f4b87983e","Type":"ContainerDied","Data":"2fc672eabcdfde188584b8ded74f5c4e2f917baeb322acb79784b8d6c120dfee"} Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.779090 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" 
event={"ID":"8e222771-a709-459f-a36f-e44f4b87983e","Type":"ContainerDied","Data":"bc74e94f60e5ca1dd91e48b51c639a01e86dbc1382d15f18a1aac3c35e28f4ad"} Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.779078 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.779117 4764 scope.go:117] "RemoveContainer" containerID="2fc672eabcdfde188584b8ded74f5c4e2f917baeb322acb79784b8d6c120dfee" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.780805 4764 generic.go:334] "Generic (PLEG): container finished" podID="32312778-9c44-4843-a588-5fba60384e05" containerID="759cbe04f032d8df08a612b05e244889018e14ae649430d1d9c9f8c95485aa5d" exitCode=0 Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.781052 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d9z59" podUID="c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2" containerName="registry-server" containerID="cri-o://ffc2a9491a6276ce817db9e39e528e75cd29f3eedc77f220c7dae09ad9a1c62b" gracePeriod=2 Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.781435 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.782204 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz" event={"ID":"32312778-9c44-4843-a588-5fba60384e05","Type":"ContainerDied","Data":"759cbe04f032d8df08a612b05e244889018e14ae649430d1d9c9f8c95485aa5d"} Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.782264 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz" event={"ID":"32312778-9c44-4843-a588-5fba60384e05","Type":"ContainerDied","Data":"6264edd6e509ddf66049653028a6e5a99b8ff3fab370367781f6b4c2c4544a37"} Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.800657 4764 scope.go:117] "RemoveContainer" containerID="2fc672eabcdfde188584b8ded74f5c4e2f917baeb322acb79784b8d6c120dfee" Mar 09 13:26:02 crc kubenswrapper[4764]: E0309 13:26:02.801016 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fc672eabcdfde188584b8ded74f5c4e2f917baeb322acb79784b8d6c120dfee\": container with ID starting with 2fc672eabcdfde188584b8ded74f5c4e2f917baeb322acb79784b8d6c120dfee not found: ID does not exist" containerID="2fc672eabcdfde188584b8ded74f5c4e2f917baeb322acb79784b8d6c120dfee" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.801048 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fc672eabcdfde188584b8ded74f5c4e2f917baeb322acb79784b8d6c120dfee"} err="failed to get container status \"2fc672eabcdfde188584b8ded74f5c4e2f917baeb322acb79784b8d6c120dfee\": rpc error: code = NotFound desc = could not find container \"2fc672eabcdfde188584b8ded74f5c4e2f917baeb322acb79784b8d6c120dfee\": container with ID starting with 
2fc672eabcdfde188584b8ded74f5c4e2f917baeb322acb79784b8d6c120dfee not found: ID does not exist" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.801077 4764 scope.go:117] "RemoveContainer" containerID="759cbe04f032d8df08a612b05e244889018e14ae649430d1d9c9f8c95485aa5d" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.823732 4764 scope.go:117] "RemoveContainer" containerID="759cbe04f032d8df08a612b05e244889018e14ae649430d1d9c9f8c95485aa5d" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.825751 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz"] Mar 09 13:26:02 crc kubenswrapper[4764]: E0309 13:26:02.825958 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"759cbe04f032d8df08a612b05e244889018e14ae649430d1d9c9f8c95485aa5d\": container with ID starting with 759cbe04f032d8df08a612b05e244889018e14ae649430d1d9c9f8c95485aa5d not found: ID does not exist" containerID="759cbe04f032d8df08a612b05e244889018e14ae649430d1d9c9f8c95485aa5d" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.826047 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"759cbe04f032d8df08a612b05e244889018e14ae649430d1d9c9f8c95485aa5d"} err="failed to get container status \"759cbe04f032d8df08a612b05e244889018e14ae649430d1d9c9f8c95485aa5d\": rpc error: code = NotFound desc = could not find container \"759cbe04f032d8df08a612b05e244889018e14ae649430d1d9c9f8c95485aa5d\": container with ID starting with 759cbe04f032d8df08a612b05e244889018e14ae649430d1d9c9f8c95485aa5d not found: ID does not exist" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.831266 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz"] Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.835577 4764 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7c9d566569-xmnfz"] Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.839430 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7c9d566569-xmnfz"] Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.138741 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-549f4d59df-4zh55"] Mar 09 13:26:03 crc kubenswrapper[4764]: E0309 13:26:03.139010 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32312778-9c44-4843-a588-5fba60384e05" containerName="route-controller-manager" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.139038 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="32312778-9c44-4843-a588-5fba60384e05" containerName="route-controller-manager" Mar 09 13:26:03 crc kubenswrapper[4764]: E0309 13:26:03.139064 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e222771-a709-459f-a36f-e44f4b87983e" containerName="controller-manager" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.139074 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e222771-a709-459f-a36f-e44f4b87983e" containerName="controller-manager" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.139198 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e222771-a709-459f-a36f-e44f4b87983e" containerName="controller-manager" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.139222 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="32312778-9c44-4843-a588-5fba60384e05" containerName="route-controller-manager" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.139688 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6"] Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.139965 4764 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.140313 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.142597 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.144004 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.144042 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.144112 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.144141 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.144208 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.144369 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.144583 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.144763 4764 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"client-ca" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.144905 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.145396 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.146105 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.147556 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76db3a34-f290-4c40-892c-f22642bae846-config\") pod \"controller-manager-549f4d59df-4zh55\" (UID: \"76db3a34-f290-4c40-892c-f22642bae846\") " pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.147612 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/76db3a34-f290-4c40-892c-f22642bae846-proxy-ca-bundles\") pod \"controller-manager-549f4d59df-4zh55\" (UID: \"76db3a34-f290-4c40-892c-f22642bae846\") " pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.147674 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76db3a34-f290-4c40-892c-f22642bae846-serving-cert\") pod \"controller-manager-549f4d59df-4zh55\" (UID: \"76db3a34-f290-4c40-892c-f22642bae846\") " pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.147725 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnsmm\" (UniqueName: \"kubernetes.io/projected/037afebc-2339-4b7b-ba28-45bd9d6e949e-kube-api-access-pnsmm\") pod \"route-controller-manager-59846f5c55-j5dz6\" (UID: \"037afebc-2339-4b7b-ba28-45bd9d6e949e\") " pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.147765 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/037afebc-2339-4b7b-ba28-45bd9d6e949e-config\") pod \"route-controller-manager-59846f5c55-j5dz6\" (UID: \"037afebc-2339-4b7b-ba28-45bd9d6e949e\") " pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.147788 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7mw2\" (UniqueName: \"kubernetes.io/projected/76db3a34-f290-4c40-892c-f22642bae846-kube-api-access-v7mw2\") pod \"controller-manager-549f4d59df-4zh55\" (UID: \"76db3a34-f290-4c40-892c-f22642bae846\") " pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.147803 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/037afebc-2339-4b7b-ba28-45bd9d6e949e-serving-cert\") pod \"route-controller-manager-59846f5c55-j5dz6\" (UID: \"037afebc-2339-4b7b-ba28-45bd9d6e949e\") " pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.147824 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/037afebc-2339-4b7b-ba28-45bd9d6e949e-client-ca\") pod \"route-controller-manager-59846f5c55-j5dz6\" (UID: \"037afebc-2339-4b7b-ba28-45bd9d6e949e\") " pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.147843 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76db3a34-f290-4c40-892c-f22642bae846-client-ca\") pod \"controller-manager-549f4d59df-4zh55\" (UID: \"76db3a34-f290-4c40-892c-f22642bae846\") " pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.152600 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.175074 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6"] Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.181517 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.197882 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-549f4d59df-4zh55"] Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.248780 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-catalog-content\") pod \"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2\" (UID: \"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2\") " Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.248878 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-utilities\") pod \"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2\" (UID: \"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2\") " Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.248997 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76db3a34-f290-4c40-892c-f22642bae846-config\") pod \"controller-manager-549f4d59df-4zh55\" (UID: \"76db3a34-f290-4c40-892c-f22642bae846\") " pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.249068 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/76db3a34-f290-4c40-892c-f22642bae846-proxy-ca-bundles\") pod \"controller-manager-549f4d59df-4zh55\" (UID: \"76db3a34-f290-4c40-892c-f22642bae846\") " pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.249132 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/76db3a34-f290-4c40-892c-f22642bae846-serving-cert\") pod \"controller-manager-549f4d59df-4zh55\" (UID: \"76db3a34-f290-4c40-892c-f22642bae846\") " pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.249183 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnsmm\" (UniqueName: \"kubernetes.io/projected/037afebc-2339-4b7b-ba28-45bd9d6e949e-kube-api-access-pnsmm\") pod \"route-controller-manager-59846f5c55-j5dz6\" (UID: \"037afebc-2339-4b7b-ba28-45bd9d6e949e\") " pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.249235 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/037afebc-2339-4b7b-ba28-45bd9d6e949e-config\") pod \"route-controller-manager-59846f5c55-j5dz6\" (UID: \"037afebc-2339-4b7b-ba28-45bd9d6e949e\") " pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.249268 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7mw2\" (UniqueName: \"kubernetes.io/projected/76db3a34-f290-4c40-892c-f22642bae846-kube-api-access-v7mw2\") pod \"controller-manager-549f4d59df-4zh55\" (UID: \"76db3a34-f290-4c40-892c-f22642bae846\") " pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.249298 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/037afebc-2339-4b7b-ba28-45bd9d6e949e-serving-cert\") pod \"route-controller-manager-59846f5c55-j5dz6\" (UID: \"037afebc-2339-4b7b-ba28-45bd9d6e949e\") " pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" Mar 09 
13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.249340 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/037afebc-2339-4b7b-ba28-45bd9d6e949e-client-ca\") pod \"route-controller-manager-59846f5c55-j5dz6\" (UID: \"037afebc-2339-4b7b-ba28-45bd9d6e949e\") " pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.249373 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76db3a34-f290-4c40-892c-f22642bae846-client-ca\") pod \"controller-manager-549f4d59df-4zh55\" (UID: \"76db3a34-f290-4c40-892c-f22642bae846\") " pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.251199 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/76db3a34-f290-4c40-892c-f22642bae846-proxy-ca-bundles\") pod \"controller-manager-549f4d59df-4zh55\" (UID: \"76db3a34-f290-4c40-892c-f22642bae846\") " pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.251418 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/037afebc-2339-4b7b-ba28-45bd9d6e949e-client-ca\") pod \"route-controller-manager-59846f5c55-j5dz6\" (UID: \"037afebc-2339-4b7b-ba28-45bd9d6e949e\") " pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.251557 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/037afebc-2339-4b7b-ba28-45bd9d6e949e-config\") pod \"route-controller-manager-59846f5c55-j5dz6\" (UID: 
\"037afebc-2339-4b7b-ba28-45bd9d6e949e\") " pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.251613 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76db3a34-f290-4c40-892c-f22642bae846-config\") pod \"controller-manager-549f4d59df-4zh55\" (UID: \"76db3a34-f290-4c40-892c-f22642bae846\") " pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.251894 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76db3a34-f290-4c40-892c-f22642bae846-client-ca\") pod \"controller-manager-549f4d59df-4zh55\" (UID: \"76db3a34-f290-4c40-892c-f22642bae846\") " pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.252782 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-utilities" (OuterVolumeSpecName: "utilities") pod "c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2" (UID: "c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.255532 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/037afebc-2339-4b7b-ba28-45bd9d6e949e-serving-cert\") pod \"route-controller-manager-59846f5c55-j5dz6\" (UID: \"037afebc-2339-4b7b-ba28-45bd9d6e949e\") " pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.255826 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76db3a34-f290-4c40-892c-f22642bae846-serving-cert\") pod \"controller-manager-549f4d59df-4zh55\" (UID: \"76db3a34-f290-4c40-892c-f22642bae846\") " pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.267013 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7mw2\" (UniqueName: \"kubernetes.io/projected/76db3a34-f290-4c40-892c-f22642bae846-kube-api-access-v7mw2\") pod \"controller-manager-549f4d59df-4zh55\" (UID: \"76db3a34-f290-4c40-892c-f22642bae846\") " pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.270321 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnsmm\" (UniqueName: \"kubernetes.io/projected/037afebc-2339-4b7b-ba28-45bd9d6e949e-kube-api-access-pnsmm\") pod \"route-controller-manager-59846f5c55-j5dz6\" (UID: \"037afebc-2339-4b7b-ba28-45bd9d6e949e\") " pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.349874 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stkxn\" (UniqueName: 
\"kubernetes.io/projected/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-kube-api-access-stkxn\") pod \"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2\" (UID: \"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2\") " Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.350628 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.353406 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-kube-api-access-stkxn" (OuterVolumeSpecName: "kube-api-access-stkxn") pod "c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2" (UID: "c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2"). InnerVolumeSpecName "kube-api-access-stkxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.374310 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2" (UID: "c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.451174 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.451439 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stkxn\" (UniqueName: \"kubernetes.io/projected/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-kube-api-access-stkxn\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.490465 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.498439 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.585314 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32312778-9c44-4843-a588-5fba60384e05" path="/var/lib/kubelet/pods/32312778-9c44-4843-a588-5fba60384e05/volumes" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.590384 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e222771-a709-459f-a36f-e44f4b87983e" path="/var/lib/kubelet/pods/8e222771-a709-459f-a36f-e44f4b87983e/volumes" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.789396 4764 generic.go:334] "Generic (PLEG): container finished" podID="c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2" containerID="ffc2a9491a6276ce817db9e39e528e75cd29f3eedc77f220c7dae09ad9a1c62b" exitCode=0 Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.789500 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d9z59" 
event={"ID":"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2","Type":"ContainerDied","Data":"ffc2a9491a6276ce817db9e39e528e75cd29f3eedc77f220c7dae09ad9a1c62b"} Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.789552 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.789593 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d9z59" event={"ID":"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2","Type":"ContainerDied","Data":"63639f892c3d7cc35dde0976454fc20f0b0dcd4c9977b4b39ee9f80a34190631"} Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.789620 4764 scope.go:117] "RemoveContainer" containerID="ffc2a9491a6276ce817db9e39e528e75cd29f3eedc77f220c7dae09ad9a1c62b" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.809908 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d9z59"] Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.811710 4764 scope.go:117] "RemoveContainer" containerID="b507b889d835011a58e3fb35bdc6611d2459c15d083a937632ed6f78fbcf35b2" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.812851 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d9z59"] Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.845859 4764 scope.go:117] "RemoveContainer" containerID="d8b26ed27d9d828e2d55afc5c28aac89bdf5c58d97e8a4288af402a3789e0cd2" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.859878 4764 scope.go:117] "RemoveContainer" containerID="ffc2a9491a6276ce817db9e39e528e75cd29f3eedc77f220c7dae09ad9a1c62b" Mar 09 13:26:03 crc kubenswrapper[4764]: E0309 13:26:03.860295 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffc2a9491a6276ce817db9e39e528e75cd29f3eedc77f220c7dae09ad9a1c62b\": container with ID 
starting with ffc2a9491a6276ce817db9e39e528e75cd29f3eedc77f220c7dae09ad9a1c62b not found: ID does not exist" containerID="ffc2a9491a6276ce817db9e39e528e75cd29f3eedc77f220c7dae09ad9a1c62b" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.860340 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffc2a9491a6276ce817db9e39e528e75cd29f3eedc77f220c7dae09ad9a1c62b"} err="failed to get container status \"ffc2a9491a6276ce817db9e39e528e75cd29f3eedc77f220c7dae09ad9a1c62b\": rpc error: code = NotFound desc = could not find container \"ffc2a9491a6276ce817db9e39e528e75cd29f3eedc77f220c7dae09ad9a1c62b\": container with ID starting with ffc2a9491a6276ce817db9e39e528e75cd29f3eedc77f220c7dae09ad9a1c62b not found: ID does not exist" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.860370 4764 scope.go:117] "RemoveContainer" containerID="b507b889d835011a58e3fb35bdc6611d2459c15d083a937632ed6f78fbcf35b2" Mar 09 13:26:03 crc kubenswrapper[4764]: E0309 13:26:03.860608 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b507b889d835011a58e3fb35bdc6611d2459c15d083a937632ed6f78fbcf35b2\": container with ID starting with b507b889d835011a58e3fb35bdc6611d2459c15d083a937632ed6f78fbcf35b2 not found: ID does not exist" containerID="b507b889d835011a58e3fb35bdc6611d2459c15d083a937632ed6f78fbcf35b2" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.860633 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b507b889d835011a58e3fb35bdc6611d2459c15d083a937632ed6f78fbcf35b2"} err="failed to get container status \"b507b889d835011a58e3fb35bdc6611d2459c15d083a937632ed6f78fbcf35b2\": rpc error: code = NotFound desc = could not find container \"b507b889d835011a58e3fb35bdc6611d2459c15d083a937632ed6f78fbcf35b2\": container with ID starting with b507b889d835011a58e3fb35bdc6611d2459c15d083a937632ed6f78fbcf35b2 not found: 
ID does not exist" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.860662 4764 scope.go:117] "RemoveContainer" containerID="d8b26ed27d9d828e2d55afc5c28aac89bdf5c58d97e8a4288af402a3789e0cd2" Mar 09 13:26:03 crc kubenswrapper[4764]: E0309 13:26:03.861082 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8b26ed27d9d828e2d55afc5c28aac89bdf5c58d97e8a4288af402a3789e0cd2\": container with ID starting with d8b26ed27d9d828e2d55afc5c28aac89bdf5c58d97e8a4288af402a3789e0cd2 not found: ID does not exist" containerID="d8b26ed27d9d828e2d55afc5c28aac89bdf5c58d97e8a4288af402a3789e0cd2" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.861111 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8b26ed27d9d828e2d55afc5c28aac89bdf5c58d97e8a4288af402a3789e0cd2"} err="failed to get container status \"d8b26ed27d9d828e2d55afc5c28aac89bdf5c58d97e8a4288af402a3789e0cd2\": rpc error: code = NotFound desc = could not find container \"d8b26ed27d9d828e2d55afc5c28aac89bdf5c58d97e8a4288af402a3789e0cd2\": container with ID starting with d8b26ed27d9d828e2d55afc5c28aac89bdf5c58d97e8a4288af402a3789e0cd2 not found: ID does not exist" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.917993 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-549f4d59df-4zh55"] Mar 09 13:26:03 crc kubenswrapper[4764]: W0309 13:26:03.930590 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76db3a34_f290_4c40_892c_f22642bae846.slice/crio-017b5cfb1ffda49d2bd410be18d85e5f0dcd32686c8057ab01f06d5a0893d59b WatchSource:0}: Error finding container 017b5cfb1ffda49d2bd410be18d85e5f0dcd32686c8057ab01f06d5a0893d59b: Status 404 returned error can't find the container with id 017b5cfb1ffda49d2bd410be18d85e5f0dcd32686c8057ab01f06d5a0893d59b Mar 09 
13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.969037 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6"] Mar 09 13:26:03 crc kubenswrapper[4764]: W0309 13:26:03.989013 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod037afebc_2339_4b7b_ba28_45bd9d6e949e.slice/crio-76fd845119563bf1fa122b5fc75988041e296874da4420c76b26f371cfa3fd42 WatchSource:0}: Error finding container 76fd845119563bf1fa122b5fc75988041e296874da4420c76b26f371cfa3fd42: Status 404 returned error can't find the container with id 76fd845119563bf1fa122b5fc75988041e296874da4420c76b26f371cfa3fd42 Mar 09 13:26:04 crc kubenswrapper[4764]: I0309 13:26:04.078165 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551046-ll87d" Mar 09 13:26:04 crc kubenswrapper[4764]: I0309 13:26:04.263847 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msj62\" (UniqueName: \"kubernetes.io/projected/c09230f9-b117-44a0-b3ed-ab6dc7ce0285-kube-api-access-msj62\") pod \"c09230f9-b117-44a0-b3ed-ab6dc7ce0285\" (UID: \"c09230f9-b117-44a0-b3ed-ab6dc7ce0285\") " Mar 09 13:26:04 crc kubenswrapper[4764]: I0309 13:26:04.269128 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c09230f9-b117-44a0-b3ed-ab6dc7ce0285-kube-api-access-msj62" (OuterVolumeSpecName: "kube-api-access-msj62") pod "c09230f9-b117-44a0-b3ed-ab6dc7ce0285" (UID: "c09230f9-b117-44a0-b3ed-ab6dc7ce0285"). InnerVolumeSpecName "kube-api-access-msj62". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:26:04 crc kubenswrapper[4764]: I0309 13:26:04.365817 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msj62\" (UniqueName: \"kubernetes.io/projected/c09230f9-b117-44a0-b3ed-ab6dc7ce0285-kube-api-access-msj62\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:04 crc kubenswrapper[4764]: I0309 13:26:04.799192 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551046-ll87d" event={"ID":"c09230f9-b117-44a0-b3ed-ab6dc7ce0285","Type":"ContainerDied","Data":"f8624d20b131f960061486b5cf6a38954f017f1461bb66469047b83c408d5f71"} Mar 09 13:26:04 crc kubenswrapper[4764]: I0309 13:26:04.799255 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8624d20b131f960061486b5cf6a38954f017f1461bb66469047b83c408d5f71" Mar 09 13:26:04 crc kubenswrapper[4764]: I0309 13:26:04.800316 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551046-ll87d" Mar 09 13:26:04 crc kubenswrapper[4764]: I0309 13:26:04.800406 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" event={"ID":"037afebc-2339-4b7b-ba28-45bd9d6e949e","Type":"ContainerStarted","Data":"3a95f0e2fee7cc7194c065f9e519774ef1f7d76dd3ee442ec0d780b7dfa666fe"} Mar 09 13:26:04 crc kubenswrapper[4764]: I0309 13:26:04.800572 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" event={"ID":"037afebc-2339-4b7b-ba28-45bd9d6e949e","Type":"ContainerStarted","Data":"76fd845119563bf1fa122b5fc75988041e296874da4420c76b26f371cfa3fd42"} Mar 09 13:26:04 crc kubenswrapper[4764]: I0309 13:26:04.801851 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" 
event={"ID":"76db3a34-f290-4c40-892c-f22642bae846","Type":"ContainerStarted","Data":"a2a09a6554fbde267814aaed185f46872ec5b6ad8edd513166ed165681209bb8"} Mar 09 13:26:04 crc kubenswrapper[4764]: I0309 13:26:04.801885 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" event={"ID":"76db3a34-f290-4c40-892c-f22642bae846","Type":"ContainerStarted","Data":"017b5cfb1ffda49d2bd410be18d85e5f0dcd32686c8057ab01f06d5a0893d59b"} Mar 09 13:26:04 crc kubenswrapper[4764]: I0309 13:26:04.802286 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:04 crc kubenswrapper[4764]: I0309 13:26:04.806452 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:04 crc kubenswrapper[4764]: I0309 13:26:04.821882 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" podStartSLOduration=3.821863241 podStartE2EDuration="3.821863241s" podCreationTimestamp="2026-03-09 13:26:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:26:04.818350195 +0000 UTC m=+320.068522103" watchObservedRunningTime="2026-03-09 13:26:04.821863241 +0000 UTC m=+320.072035149" Mar 09 13:26:05 crc kubenswrapper[4764]: I0309 13:26:05.108025 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" podStartSLOduration=4.108001334 podStartE2EDuration="4.108001334s" podCreationTimestamp="2026-03-09 13:26:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:26:04.84082437 +0000 
UTC m=+320.090996288" watchObservedRunningTime="2026-03-09 13:26:05.108001334 +0000 UTC m=+320.358173242" Mar 09 13:26:05 crc kubenswrapper[4764]: I0309 13:26:05.566908 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2" path="/var/lib/kubelet/pods/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2/volumes" Mar 09 13:26:05 crc kubenswrapper[4764]: I0309 13:26:05.807498 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" Mar 09 13:26:05 crc kubenswrapper[4764]: I0309 13:26:05.813312 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" Mar 09 13:26:07 crc kubenswrapper[4764]: I0309 13:26:07.194997 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:26:07 crc kubenswrapper[4764]: I0309 13:26:07.645487 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-g7k9k" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.199471 4764 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 09 13:26:09 crc kubenswrapper[4764]: E0309 13:26:09.200002 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2" containerName="registry-server" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.200018 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2" containerName="registry-server" Mar 09 13:26:09 crc kubenswrapper[4764]: E0309 13:26:09.200036 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2" containerName="extract-utilities" Mar 09 13:26:09 crc kubenswrapper[4764]: 
I0309 13:26:09.200044 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2" containerName="extract-utilities" Mar 09 13:26:09 crc kubenswrapper[4764]: E0309 13:26:09.200057 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c09230f9-b117-44a0-b3ed-ab6dc7ce0285" containerName="oc" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.200064 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c09230f9-b117-44a0-b3ed-ab6dc7ce0285" containerName="oc" Mar 09 13:26:09 crc kubenswrapper[4764]: E0309 13:26:09.200081 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2" containerName="extract-content" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.200088 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2" containerName="extract-content" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.200201 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c09230f9-b117-44a0-b3ed-ab6dc7ce0285" containerName="oc" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.200221 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2" containerName="registry-server" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.200605 4764 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.200789 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.200916 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7" gracePeriod=15 Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.200954 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b" gracePeriod=15 Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.200966 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6" gracePeriod=15 Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.200926 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811" gracePeriod=15 Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201091 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f" gracePeriod=15 Mar 09 13:26:09 crc 
kubenswrapper[4764]: I0309 13:26:09.201349 4764 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 13:26:09 crc kubenswrapper[4764]: E0309 13:26:09.201472 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201485 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 09 13:26:09 crc kubenswrapper[4764]: E0309 13:26:09.201492 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201499 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 09 13:26:09 crc kubenswrapper[4764]: E0309 13:26:09.201509 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201516 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:26:09 crc kubenswrapper[4764]: E0309 13:26:09.201522 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201529 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:26:09 crc kubenswrapper[4764]: E0309 13:26:09.201539 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:26:09 crc kubenswrapper[4764]: 
I0309 13:26:09.201546 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:26:09 crc kubenswrapper[4764]: E0309 13:26:09.201556 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201563 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 09 13:26:09 crc kubenswrapper[4764]: E0309 13:26:09.201575 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201582 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 09 13:26:09 crc kubenswrapper[4764]: E0309 13:26:09.201591 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201598 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201711 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201722 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201731 4764 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201737 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201746 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201753 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201760 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201769 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:26:09 crc kubenswrapper[4764]: E0309 13:26:09.201851 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201857 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:26:09 crc kubenswrapper[4764]: E0309 13:26:09.201872 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201879 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 
09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201956 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.232810 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.232870 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.232892 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.232911 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.232973 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.233007 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.233023 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.233050 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.259754 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.333906 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.333962 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.333989 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.334012 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.334033 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.334062 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 
13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.334319 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.334397 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.334435 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.334431 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.334457 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.334484 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.334514 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.334487 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.334506 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.334562 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.553790 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:26:09 crc kubenswrapper[4764]: W0309 13:26:09.578467 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-2206d89c112cb493c431c0955e1d8f5a2852de0b3891643d63061de8ac6c7054 WatchSource:0}: Error finding container 2206d89c112cb493c431c0955e1d8f5a2852de0b3891643d63061de8ac6c7054: Status 404 returned error can't find the container with id 2206d89c112cb493c431c0955e1d8f5a2852de0b3891643d63061de8ac6c7054 Mar 09 13:26:09 crc kubenswrapper[4764]: E0309 13:26:09.581322 4764 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.52:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b2f31e152c0df openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:26:09.580622047 +0000 UTC m=+324.830793955,LastTimestamp:2026-03-09 13:26:09.580622047 +0000 UTC m=+324.830793955,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.830759 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="6079e5ed-2acc-42f5-a62e-ea2a98b18abd" containerID="a9112ccd1b21485b0fe1f8f0b1cfb2709637cb4e00bc4c414110531d66291dd4" exitCode=0 Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.830864 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6079e5ed-2acc-42f5-a62e-ea2a98b18abd","Type":"ContainerDied","Data":"a9112ccd1b21485b0fe1f8f0b1cfb2709637cb4e00bc4c414110531d66291dd4"} Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.831809 4764 status_manager.go:851] "Failed to get status for pod" podUID="6079e5ed-2acc-42f5-a62e-ea2a98b18abd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.832538 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.836042 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.837934 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.838673 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811" exitCode=0 Mar 09 13:26:09 
crc kubenswrapper[4764]: I0309 13:26:09.838699 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f" exitCode=0 Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.838709 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b" exitCode=0 Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.838718 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6" exitCode=2 Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.838760 4764 scope.go:117] "RemoveContainer" containerID="033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.842192 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"34d89c76e37e470d7c44361ef7ee1eb9e62495b1f676f3e88198ea7b09a66772"} Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.842229 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"2206d89c112cb493c431c0955e1d8f5a2852de0b3891643d63061de8ac6c7054"} Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.842936 4764 status_manager.go:851] "Failed to get status for pod" podUID="6079e5ed-2acc-42f5-a62e-ea2a98b18abd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:09 crc 
kubenswrapper[4764]: I0309 13:26:09.843183 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:10 crc kubenswrapper[4764]: E0309 13:26:10.651101 4764 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.129.56.52:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" volumeName="registry-storage" Mar 09 13:26:10 crc kubenswrapper[4764]: E0309 13:26:10.763239 4764 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.52:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b2f31e152c0df openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:26:09.580622047 +0000 UTC m=+324.830793955,LastTimestamp:2026-03-09 13:26:09.580622047 +0000 UTC 
m=+324.830793955,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:26:10 crc kubenswrapper[4764]: I0309 13:26:10.850787 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.210237 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.211132 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.211492 4764 status_manager.go:851] "Failed to get status for pod" podUID="6079e5ed-2acc-42f5-a62e-ea2a98b18abd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.370283 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-kube-api-access\") pod \"6079e5ed-2acc-42f5-a62e-ea2a98b18abd\" (UID: \"6079e5ed-2acc-42f5-a62e-ea2a98b18abd\") " Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.370397 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-var-lock\") pod 
\"6079e5ed-2acc-42f5-a62e-ea2a98b18abd\" (UID: \"6079e5ed-2acc-42f5-a62e-ea2a98b18abd\") " Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.370508 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-kubelet-dir\") pod \"6079e5ed-2acc-42f5-a62e-ea2a98b18abd\" (UID: \"6079e5ed-2acc-42f5-a62e-ea2a98b18abd\") " Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.370692 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-var-lock" (OuterVolumeSpecName: "var-lock") pod "6079e5ed-2acc-42f5-a62e-ea2a98b18abd" (UID: "6079e5ed-2acc-42f5-a62e-ea2a98b18abd"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.370731 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6079e5ed-2acc-42f5-a62e-ea2a98b18abd" (UID: "6079e5ed-2acc-42f5-a62e-ea2a98b18abd"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.370796 4764 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-var-lock\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.370813 4764 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.377393 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6079e5ed-2acc-42f5-a62e-ea2a98b18abd" (UID: "6079e5ed-2acc-42f5-a62e-ea2a98b18abd"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.471950 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.580125 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.581687 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.582471 4764 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.583026 4764 status_manager.go:851] "Failed to get status for pod" podUID="6079e5ed-2acc-42f5-a62e-ea2a98b18abd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.583375 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.776058 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.776222 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.776207 4764 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.776306 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.776320 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.776340 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.776621 4764 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.776632 4764 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.776662 4764 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.861870 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.862846 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7" exitCode=0 Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.862971 4764 scope.go:117] "RemoveContainer" containerID="e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.862989 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.865857 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6079e5ed-2acc-42f5-a62e-ea2a98b18abd","Type":"ContainerDied","Data":"2b27f2baa5d5f2b2cb7d6fda09a6905faac786ec6165e8062e44b7064fdcbfbc"} Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.865892 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b27f2baa5d5f2b2cb7d6fda09a6905faac786ec6165e8062e44b7064fdcbfbc" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.865914 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.870623 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.871191 4764 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.871535 4764 status_manager.go:851] "Failed to get status for pod" podUID="6079e5ed-2acc-42f5-a62e-ea2a98b18abd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:11 
crc kubenswrapper[4764]: I0309 13:26:11.876692 4764 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.877271 4764 status_manager.go:851] "Failed to get status for pod" podUID="6079e5ed-2acc-42f5-a62e-ea2a98b18abd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.877877 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.878045 4764 scope.go:117] "RemoveContainer" containerID="a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.892340 4764 scope.go:117] "RemoveContainer" containerID="c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.906735 4764 scope.go:117] "RemoveContainer" containerID="4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.925435 4764 scope.go:117] "RemoveContainer" containerID="482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.943268 4764 scope.go:117] "RemoveContainer" 
containerID="a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.962024 4764 scope.go:117] "RemoveContainer" containerID="e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811" Mar 09 13:26:11 crc kubenswrapper[4764]: E0309 13:26:11.962516 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\": container with ID starting with e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811 not found: ID does not exist" containerID="e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.962557 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811"} err="failed to get container status \"e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\": rpc error: code = NotFound desc = could not find container \"e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\": container with ID starting with e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811 not found: ID does not exist" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.962583 4764 scope.go:117] "RemoveContainer" containerID="a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f" Mar 09 13:26:11 crc kubenswrapper[4764]: E0309 13:26:11.962858 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\": container with ID starting with a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f not found: ID does not exist" containerID="a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f" Mar 09 13:26:11 crc 
kubenswrapper[4764]: I0309 13:26:11.962886 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f"} err="failed to get container status \"a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\": rpc error: code = NotFound desc = could not find container \"a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\": container with ID starting with a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f not found: ID does not exist" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.962903 4764 scope.go:117] "RemoveContainer" containerID="c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b" Mar 09 13:26:11 crc kubenswrapper[4764]: E0309 13:26:11.963140 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\": container with ID starting with c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b not found: ID does not exist" containerID="c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.963164 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b"} err="failed to get container status \"c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\": rpc error: code = NotFound desc = could not find container \"c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\": container with ID starting with c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b not found: ID does not exist" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.963180 4764 scope.go:117] "RemoveContainer" containerID="4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6" Mar 09 
13:26:11 crc kubenswrapper[4764]: E0309 13:26:11.963422 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\": container with ID starting with 4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6 not found: ID does not exist" containerID="4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.963444 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6"} err="failed to get container status \"4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\": rpc error: code = NotFound desc = could not find container \"4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\": container with ID starting with 4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6 not found: ID does not exist" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.963463 4764 scope.go:117] "RemoveContainer" containerID="482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7" Mar 09 13:26:11 crc kubenswrapper[4764]: E0309 13:26:11.963944 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\": container with ID starting with 482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7 not found: ID does not exist" containerID="482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.963986 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7"} err="failed to get container status 
\"482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\": rpc error: code = NotFound desc = could not find container \"482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\": container with ID starting with 482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7 not found: ID does not exist" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.964016 4764 scope.go:117] "RemoveContainer" containerID="a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3" Mar 09 13:26:11 crc kubenswrapper[4764]: E0309 13:26:11.964312 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\": container with ID starting with a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3 not found: ID does not exist" containerID="a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.964341 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3"} err="failed to get container status \"a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\": rpc error: code = NotFound desc = could not find container \"a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\": container with ID starting with a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3 not found: ID does not exist" Mar 09 13:26:13 crc kubenswrapper[4764]: I0309 13:26:13.567553 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 09 13:26:15 crc kubenswrapper[4764]: I0309 13:26:15.561852 4764 status_manager.go:851] "Failed to get status for pod" podUID="6079e5ed-2acc-42f5-a62e-ea2a98b18abd" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:15 crc kubenswrapper[4764]: I0309 13:26:15.562429 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:17 crc kubenswrapper[4764]: E0309 13:26:17.381627 4764 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:17 crc kubenswrapper[4764]: E0309 13:26:17.382751 4764 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:17 crc kubenswrapper[4764]: E0309 13:26:17.383158 4764 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:17 crc kubenswrapper[4764]: E0309 13:26:17.383568 4764 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:17 crc kubenswrapper[4764]: E0309 13:26:17.384037 4764 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:17 crc kubenswrapper[4764]: I0309 13:26:17.384082 4764 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 09 13:26:17 crc kubenswrapper[4764]: E0309 13:26:17.384446 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.52:6443: connect: connection refused" interval="200ms" Mar 09 13:26:17 crc kubenswrapper[4764]: E0309 13:26:17.586037 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.52:6443: connect: connection refused" interval="400ms" Mar 09 13:26:17 crc kubenswrapper[4764]: E0309 13:26:17.986471 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.52:6443: connect: connection refused" interval="800ms" Mar 09 13:26:18 crc kubenswrapper[4764]: E0309 13:26:18.787188 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.52:6443: connect: connection refused" interval="1.6s" Mar 09 13:26:20 crc kubenswrapper[4764]: E0309 13:26:20.388308 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.52:6443: connect: connection 
refused" interval="3.2s" Mar 09 13:26:20 crc kubenswrapper[4764]: E0309 13:26:20.764449 4764 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.52:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b2f31e152c0df openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:26:09.580622047 +0000 UTC m=+324.830793955,LastTimestamp:2026-03-09 13:26:09.580622047 +0000 UTC m=+324.830793955,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:26:22 crc kubenswrapper[4764]: I0309 13:26:22.559067 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:26:22 crc kubenswrapper[4764]: I0309 13:26:22.560745 4764 status_manager.go:851] "Failed to get status for pod" podUID="6079e5ed-2acc-42f5-a62e-ea2a98b18abd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:22 crc kubenswrapper[4764]: I0309 13:26:22.561901 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:22 crc kubenswrapper[4764]: I0309 13:26:22.585769 4764 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7c21ab2-1820-47de-a61d-71d81928564a" Mar 09 13:26:22 crc kubenswrapper[4764]: I0309 13:26:22.585810 4764 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7c21ab2-1820-47de-a61d-71d81928564a" Mar 09 13:26:22 crc kubenswrapper[4764]: E0309 13:26:22.586457 4764 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.52:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:26:22 crc kubenswrapper[4764]: I0309 13:26:22.587375 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:26:22 crc kubenswrapper[4764]: W0309 13:26:22.622620 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-05f248f283cd22d284be0ce2b65eee19c9c2cfbe94abfcc012118bce277e2802 WatchSource:0}: Error finding container 05f248f283cd22d284be0ce2b65eee19c9c2cfbe94abfcc012118bce277e2802: Status 404 returned error can't find the container with id 05f248f283cd22d284be0ce2b65eee19c9c2cfbe94abfcc012118bce277e2802 Mar 09 13:26:22 crc kubenswrapper[4764]: I0309 13:26:22.875135 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-nj856" podUID="f9b3244b-8df0-4330-9887-4092260d416a" containerName="oauth-openshift" containerID="cri-o://5e8fcdb2f8638bd16f9f163bfced3472de42f2c6547d60287a9a8804c7cc74e3" gracePeriod=15 Mar 09 13:26:22 crc kubenswrapper[4764]: I0309 13:26:22.927800 4764 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="3f15ba85d13eeff185b108cb2355b170c5025452a10f0b464ee7561803e49a28" exitCode=0 Mar 09 13:26:22 crc kubenswrapper[4764]: I0309 13:26:22.927907 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"3f15ba85d13eeff185b108cb2355b170c5025452a10f0b464ee7561803e49a28"} Mar 09 13:26:22 crc kubenswrapper[4764]: I0309 13:26:22.927985 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"05f248f283cd22d284be0ce2b65eee19c9c2cfbe94abfcc012118bce277e2802"} Mar 09 13:26:22 crc kubenswrapper[4764]: I0309 13:26:22.928370 4764 kubelet.go:1909] "Trying to delete pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7c21ab2-1820-47de-a61d-71d81928564a" Mar 09 13:26:22 crc kubenswrapper[4764]: I0309 13:26:22.928395 4764 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7c21ab2-1820-47de-a61d-71d81928564a" Mar 09 13:26:22 crc kubenswrapper[4764]: I0309 13:26:22.928792 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:22 crc kubenswrapper[4764]: E0309 13:26:22.928866 4764 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.52:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:26:22 crc kubenswrapper[4764]: I0309 13:26:22.929164 4764 status_manager.go:851] "Failed to get status for pod" podUID="6079e5ed-2acc-42f5-a62e-ea2a98b18abd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.328125 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.528700 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-error\") pod \"f9b3244b-8df0-4330-9887-4092260d416a\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.528776 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-service-ca\") pod \"f9b3244b-8df0-4330-9887-4092260d416a\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.528805 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-idp-0-file-data\") pod \"f9b3244b-8df0-4330-9887-4092260d416a\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.528831 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-cliconfig\") pod \"f9b3244b-8df0-4330-9887-4092260d416a\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.528851 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-ocp-branding-template\") pod \"f9b3244b-8df0-4330-9887-4092260d416a\" (UID: 
\"f9b3244b-8df0-4330-9887-4092260d416a\") " Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.528873 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f9b3244b-8df0-4330-9887-4092260d416a-audit-dir\") pod \"f9b3244b-8df0-4330-9887-4092260d416a\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.528892 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-provider-selection\") pod \"f9b3244b-8df0-4330-9887-4092260d416a\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.528919 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-serving-cert\") pod \"f9b3244b-8df0-4330-9887-4092260d416a\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.528937 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdsds\" (UniqueName: \"kubernetes.io/projected/f9b3244b-8df0-4330-9887-4092260d416a-kube-api-access-tdsds\") pod \"f9b3244b-8df0-4330-9887-4092260d416a\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.528952 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-session\") pod \"f9b3244b-8df0-4330-9887-4092260d416a\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.528971 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-login\") pod \"f9b3244b-8df0-4330-9887-4092260d416a\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.529002 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-router-certs\") pod \"f9b3244b-8df0-4330-9887-4092260d416a\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.529037 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-trusted-ca-bundle\") pod \"f9b3244b-8df0-4330-9887-4092260d416a\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.529095 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-audit-policies\") pod \"f9b3244b-8df0-4330-9887-4092260d416a\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.530228 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "f9b3244b-8df0-4330-9887-4092260d416a" (UID: "f9b3244b-8df0-4330-9887-4092260d416a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.530264 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9b3244b-8df0-4330-9887-4092260d416a-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f9b3244b-8df0-4330-9887-4092260d416a" (UID: "f9b3244b-8df0-4330-9887-4092260d416a"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.534006 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "f9b3244b-8df0-4330-9887-4092260d416a" (UID: "f9b3244b-8df0-4330-9887-4092260d416a"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.534655 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "f9b3244b-8df0-4330-9887-4092260d416a" (UID: "f9b3244b-8df0-4330-9887-4092260d416a"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.534925 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "f9b3244b-8df0-4330-9887-4092260d416a" (UID: "f9b3244b-8df0-4330-9887-4092260d416a"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.535317 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "f9b3244b-8df0-4330-9887-4092260d416a" (UID: "f9b3244b-8df0-4330-9887-4092260d416a"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.535634 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "f9b3244b-8df0-4330-9887-4092260d416a" (UID: "f9b3244b-8df0-4330-9887-4092260d416a"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.536879 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "f9b3244b-8df0-4330-9887-4092260d416a" (UID: "f9b3244b-8df0-4330-9887-4092260d416a"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.537174 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "f9b3244b-8df0-4330-9887-4092260d416a" (UID: "f9b3244b-8df0-4330-9887-4092260d416a"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.537580 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9b3244b-8df0-4330-9887-4092260d416a-kube-api-access-tdsds" (OuterVolumeSpecName: "kube-api-access-tdsds") pod "f9b3244b-8df0-4330-9887-4092260d416a" (UID: "f9b3244b-8df0-4330-9887-4092260d416a"). InnerVolumeSpecName "kube-api-access-tdsds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.538893 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "f9b3244b-8df0-4330-9887-4092260d416a" (UID: "f9b3244b-8df0-4330-9887-4092260d416a"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.539565 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "f9b3244b-8df0-4330-9887-4092260d416a" (UID: "f9b3244b-8df0-4330-9887-4092260d416a"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.540117 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "f9b3244b-8df0-4330-9887-4092260d416a" (UID: "f9b3244b-8df0-4330-9887-4092260d416a"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.540247 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "f9b3244b-8df0-4330-9887-4092260d416a" (UID: "f9b3244b-8df0-4330-9887-4092260d416a"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.631071 4764 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.631117 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.631137 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.631156 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.631171 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath 
\"\"" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.631188 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.631203 4764 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f9b3244b-8df0-4330-9887-4092260d416a-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.631220 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.631237 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.631253 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdsds\" (UniqueName: \"kubernetes.io/projected/f9b3244b-8df0-4330-9887-4092260d416a-kube-api-access-tdsds\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.631269 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.631286 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.631302 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.631317 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.934360 4764 generic.go:334] "Generic (PLEG): container finished" podID="f9b3244b-8df0-4330-9887-4092260d416a" containerID="5e8fcdb2f8638bd16f9f163bfced3472de42f2c6547d60287a9a8804c7cc74e3" exitCode=0 Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.934406 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-nj856" event={"ID":"f9b3244b-8df0-4330-9887-4092260d416a","Type":"ContainerDied","Data":"5e8fcdb2f8638bd16f9f163bfced3472de42f2c6547d60287a9a8804c7cc74e3"} Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.934450 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-nj856" event={"ID":"f9b3244b-8df0-4330-9887-4092260d416a","Type":"ContainerDied","Data":"42c43e046cf7b3c37122d69fdadfdf37b294da8c4d73ad8e7c4ac09039e43f1c"} Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.934474 4764 scope.go:117] "RemoveContainer" containerID="5e8fcdb2f8638bd16f9f163bfced3472de42f2c6547d60287a9a8804c7cc74e3" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.934488 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.941661 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.942892 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.942938 4764 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="c7857b72b1425eeefd8acf226dd6f0d6c783e14267fcfb79dd038118573a2b26" exitCode=1 Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.943015 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c7857b72b1425eeefd8acf226dd6f0d6c783e14267fcfb79dd038118573a2b26"} Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.943721 4764 scope.go:117] "RemoveContainer" containerID="c7857b72b1425eeefd8acf226dd6f0d6c783e14267fcfb79dd038118573a2b26" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.951202 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"63b66a410c949ae6d3d6dc6b0c563f5b581fecd85b8265b0c00a4c83886e2ce0"} Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.951253 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7e060219163b02e46f8db3481391682e57e0b6de76226ef075e51a64be0bec36"} Mar 09 
13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.951266 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"df6c1ff2f65d47f0a3c78d6dc60d07c1b853d55ca9eb1473ea9b5a644648e0a1"} Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.951276 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"183304b472a2d0deb0b611dfab80a08e0d465eddb0d348720b4b779cab544b60"} Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.959134 4764 scope.go:117] "RemoveContainer" containerID="5e8fcdb2f8638bd16f9f163bfced3472de42f2c6547d60287a9a8804c7cc74e3" Mar 09 13:26:23 crc kubenswrapper[4764]: E0309 13:26:23.959782 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e8fcdb2f8638bd16f9f163bfced3472de42f2c6547d60287a9a8804c7cc74e3\": container with ID starting with 5e8fcdb2f8638bd16f9f163bfced3472de42f2c6547d60287a9a8804c7cc74e3 not found: ID does not exist" containerID="5e8fcdb2f8638bd16f9f163bfced3472de42f2c6547d60287a9a8804c7cc74e3" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.959816 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e8fcdb2f8638bd16f9f163bfced3472de42f2c6547d60287a9a8804c7cc74e3"} err="failed to get container status \"5e8fcdb2f8638bd16f9f163bfced3472de42f2c6547d60287a9a8804c7cc74e3\": rpc error: code = NotFound desc = could not find container \"5e8fcdb2f8638bd16f9f163bfced3472de42f2c6547d60287a9a8804c7cc74e3\": container with ID starting with 5e8fcdb2f8638bd16f9f163bfced3472de42f2c6547d60287a9a8804c7cc74e3 not found: ID does not exist" Mar 09 13:26:24 crc kubenswrapper[4764]: I0309 13:26:24.643171 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:26:24 crc kubenswrapper[4764]: I0309 13:26:24.643428 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:26:24 crc kubenswrapper[4764]: I0309 13:26:24.643458 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:26:24 crc kubenswrapper[4764]: I0309 13:26:24.643488 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:26:24 crc kubenswrapper[4764]: I0309 13:26:24.963902 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 09 13:26:24 crc kubenswrapper[4764]: I0309 13:26:24.965157 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 09 13:26:24 crc kubenswrapper[4764]: I0309 13:26:24.965243 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fb20688a647451d7fe5639f828c352ed485f469ea775fff8f4e5ab64cd8e6498"} Mar 09 13:26:24 crc kubenswrapper[4764]: I0309 13:26:24.969952 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"24bdf5877ca80ce2fd116bfc5e95f5b85a8ad7217195967412e57a418efae285"} Mar 09 13:26:24 crc kubenswrapper[4764]: I0309 13:26:24.970320 4764 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7c21ab2-1820-47de-a61d-71d81928564a" Mar 09 13:26:24 crc kubenswrapper[4764]: I0309 13:26:24.970347 4764 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7c21ab2-1820-47de-a61d-71d81928564a" Mar 09 13:26:24 crc kubenswrapper[4764]: I0309 13:26:24.971736 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:26:25 crc kubenswrapper[4764]: E0309 13:26:25.644423 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: failed to sync secret cache: timed out waiting for the condition Mar 09 13:26:25 crc kubenswrapper[4764]: E0309 13:26:25.644493 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 09 13:26:25 crc kubenswrapper[4764]: E0309 13:26:25.644514 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: 
failed to sync configmap cache: timed out waiting for the condition Mar 09 13:26:25 crc kubenswrapper[4764]: E0309 13:26:25.644549 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: failed to sync configmap cache: timed out waiting for the condition Mar 09 13:26:25 crc kubenswrapper[4764]: E0309 13:26:25.644520 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:28:27.644497082 +0000 UTC m=+462.894668990 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync secret cache: timed out waiting for the condition Mar 09 13:26:25 crc kubenswrapper[4764]: E0309 13:26:25.644628 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:28:27.644608985 +0000 UTC m=+462.894780893 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 13:26:26 crc kubenswrapper[4764]: E0309 13:26:26.644992 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 09 13:26:26 crc kubenswrapper[4764]: E0309 13:26:26.645022 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 09 13:26:26 crc kubenswrapper[4764]: E0309 13:26:26.645055 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: failed to sync configmap cache: timed out waiting for the condition
Mar 09 13:26:26 crc kubenswrapper[4764]: E0309 13:26:26.645078 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: failed to sync configmap cache: timed out waiting for the condition
Mar 09 13:26:26 crc kubenswrapper[4764]: E0309 13:26:26.645177 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:28:28.645141737 +0000 UTC m=+463.895313655 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 13:26:26 crc kubenswrapper[4764]: E0309 13:26:26.645205 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:28:28.645194838 +0000 UTC m=+463.895366756 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 13:26:27 crc kubenswrapper[4764]: I0309 13:26:27.588452 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 13:26:27 crc kubenswrapper[4764]: I0309 13:26:27.588890 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 13:26:27 crc kubenswrapper[4764]: I0309 13:26:27.596153 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 13:26:29 crc kubenswrapper[4764]: I0309 13:26:29.649759 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 09 13:26:29 crc kubenswrapper[4764]: I0309 13:26:29.649763 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 09 13:26:29 crc kubenswrapper[4764]: I0309 13:26:29.649971 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 09 13:26:29 crc kubenswrapper[4764]: I0309 13:26:29.980448 4764 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 13:26:30 crc kubenswrapper[4764]: I0309 13:26:30.568195 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 13:26:30 crc kubenswrapper[4764]: I0309 13:26:30.568386 4764 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Mar 09 13:26:30 crc kubenswrapper[4764]: I0309 13:26:30.568432 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Mar 09 13:26:30 crc kubenswrapper[4764]: I0309 13:26:30.647874 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 09 13:26:31 crc kubenswrapper[4764]: I0309 13:26:31.007984 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_71bb4a3aecc4ba5b26c4b7318770ce13/kube-apiserver-check-endpoints/0.log"
Mar 09 13:26:31 crc kubenswrapper[4764]: I0309 13:26:31.010258 4764 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="24bdf5877ca80ce2fd116bfc5e95f5b85a8ad7217195967412e57a418efae285"
exitCode=255
Mar 09 13:26:31 crc kubenswrapper[4764]: I0309 13:26:31.010307 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"24bdf5877ca80ce2fd116bfc5e95f5b85a8ad7217195967412e57a418efae285"}
Mar 09 13:26:31 crc kubenswrapper[4764]: I0309 13:26:31.010739 4764 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7c21ab2-1820-47de-a61d-71d81928564a"
Mar 09 13:26:31 crc kubenswrapper[4764]: I0309 13:26:31.010793 4764 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7c21ab2-1820-47de-a61d-71d81928564a"
Mar 09 13:26:31 crc kubenswrapper[4764]: I0309 13:26:31.014034 4764 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="9a603f3e-3222-4fb3-8ede-6d83ee2e90f5"
Mar 09 13:26:31 crc kubenswrapper[4764]: I0309 13:26:31.014497 4764 scope.go:117] "RemoveContainer" containerID="24bdf5877ca80ce2fd116bfc5e95f5b85a8ad7217195967412e57a418efae285"
Mar 09 13:26:31 crc kubenswrapper[4764]: I0309 13:26:31.015361 4764 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://183304b472a2d0deb0b611dfab80a08e0d465eddb0d348720b4b779cab544b60"
Mar 09 13:26:31 crc kubenswrapper[4764]: I0309 13:26:31.015386 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 13:26:32 crc kubenswrapper[4764]: I0309 13:26:32.016269 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_71bb4a3aecc4ba5b26c4b7318770ce13/kube-apiserver-check-endpoints/0.log"
Mar 09 13:26:32 crc kubenswrapper[4764]: I0309 13:26:32.017900 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"cf28d9c9b484bede2e19ccb05028dda2bfdc5dce7bd6007e9619002f8a6be71f"}
Mar 09 13:26:32 crc kubenswrapper[4764]: I0309 13:26:32.018134 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 13:26:32 crc kubenswrapper[4764]: I0309 13:26:32.018215 4764 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7c21ab2-1820-47de-a61d-71d81928564a"
Mar 09 13:26:32 crc kubenswrapper[4764]: I0309 13:26:32.018339 4764 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7c21ab2-1820-47de-a61d-71d81928564a"
Mar 09 13:26:32 crc kubenswrapper[4764]: I0309 13:26:32.121636 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 13:26:33 crc kubenswrapper[4764]: I0309 13:26:33.022525 4764 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7c21ab2-1820-47de-a61d-71d81928564a"
Mar 09 13:26:33 crc kubenswrapper[4764]: I0309 13:26:33.022551 4764 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7c21ab2-1820-47de-a61d-71d81928564a"
Mar 09 13:26:35 crc kubenswrapper[4764]: E0309 13:26:35.572002 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-cqllr], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 13:26:35 crc kubenswrapper[4764]: I0309 13:26:35.575403 4764 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="9a603f3e-3222-4fb3-8ede-6d83ee2e90f5"
Mar 09 13:26:35 crc kubenswrapper[4764]: E0309 13:26:35.579911 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert nginx-conf], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 13:26:35 crc kubenswrapper[4764]: E0309 13:26:35.585489 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-s2dwl], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 13:26:39 crc kubenswrapper[4764]: I0309 13:26:39.511604 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 09 13:26:40 crc kubenswrapper[4764]: I0309 13:26:40.569122 4764 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Mar 09 13:26:40 crc kubenswrapper[4764]: I0309 13:26:40.569208 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Mar 09 13:26:41 crc kubenswrapper[4764]: I0309 13:26:41.016934 4764 reflector.go:368] Caches populated for *v1.Secret from
object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 09 13:26:41 crc kubenswrapper[4764]: I0309 13:26:41.572446 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 09 13:26:41 crc kubenswrapper[4764]: I0309 13:26:41.583606 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 09 13:26:41 crc kubenswrapper[4764]: I0309 13:26:41.687352 4764 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 09 13:26:41 crc kubenswrapper[4764]: I0309 13:26:41.812514 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 09 13:26:42 crc kubenswrapper[4764]: I0309 13:26:42.142236 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 09 13:26:42 crc kubenswrapper[4764]: I0309 13:26:42.416970 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 09 13:26:42 crc kubenswrapper[4764]: I0309 13:26:42.817593 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 09 13:26:42 crc kubenswrapper[4764]: I0309 13:26:42.820881 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 09 13:26:42 crc kubenswrapper[4764]: I0309 13:26:42.855792 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 09 13:26:42 crc kubenswrapper[4764]: I0309 13:26:42.925357 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 09 13:26:43 crc kubenswrapper[4764]: I0309 13:26:43.163289 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 09 13:26:43 crc kubenswrapper[4764]: I0309 13:26:43.334825 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 09 13:26:43 crc kubenswrapper[4764]: I0309 13:26:43.358602 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 09 13:26:43 crc kubenswrapper[4764]: I0309 13:26:43.569505 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 09 13:26:43 crc kubenswrapper[4764]: I0309 13:26:43.728710 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 09 13:26:43 crc kubenswrapper[4764]: I0309 13:26:43.831047 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 09 13:26:43 crc kubenswrapper[4764]: I0309 13:26:43.915688 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 09 13:26:43 crc kubenswrapper[4764]: I0309 13:26:43.924124 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.023754 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.174393 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.224741 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.225682 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.236845 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.308969 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.390316 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.432366 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.469210 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.490744 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.522574 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.558025 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.558592 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.647498 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.648241 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.723234 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.735940 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.780765 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.927459 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.982733 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.986845 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.023004 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.039993 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.048287 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.105022 4764 reflector.go:368] Caches populated for *v1.ConfigMap from
object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.122868 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.127437 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.174771 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.267070 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.298014 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.362999 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.486093 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.590059 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.620761 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.639573 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.645314 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.758833 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.802379 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.809578 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.814616 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.829574 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 09 13:26:46 crc kubenswrapper[4764]: I0309 13:26:46.184394 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 09 13:26:46 crc kubenswrapper[4764]: I0309 13:26:46.228943 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 09 13:26:46 crc kubenswrapper[4764]: I0309 13:26:46.289829 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 09 13:26:46 crc kubenswrapper[4764]: I0309 13:26:46.507470 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 09 13:26:46 crc kubenswrapper[4764]: I0309 13:26:46.559792 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:26:46 crc kubenswrapper[4764]: I0309 13:26:46.613695 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 09 13:26:46 crc kubenswrapper[4764]: I0309 13:26:46.665370 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 09 13:26:46 crc kubenswrapper[4764]: I0309 13:26:46.665419 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 09 13:26:46 crc kubenswrapper[4764]: I0309 13:26:46.756635 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 09 13:26:46 crc kubenswrapper[4764]: I0309 13:26:46.771565 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 09 13:26:46 crc kubenswrapper[4764]: I0309 13:26:46.829556 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 09 13:26:46 crc kubenswrapper[4764]: I0309 13:26:46.892377 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 09 13:26:46 crc kubenswrapper[4764]: I0309 13:26:46.900235 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 09 13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.037195 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 09 13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.152213 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 09 13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.202735 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 09 13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.223973 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 09 13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.226478 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 09 13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.235999 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 09 13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.325310 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 09 13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.343104 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 09 13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.394857 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 09 13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.440335 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 09 13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.498201 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 09 13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.500567 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 09 13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.520524 4764 reflector.go:368] Caches populated for *v1.ConfigMap from
object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 09 13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.651713 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 09 13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.725721 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 09 13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.751694 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 09 13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.752281 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.094748 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.143158 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.161153 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.167018 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.194145 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.211696 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.383108 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.572208 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.586147 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.611052 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.654963 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.682769 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.714887 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.723293 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.725179 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.746703 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.773921 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.812323 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.813535 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.813943 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.970790 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.047164 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.049434 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.053243 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.056141 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.135775 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.136057 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.177245 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.200632 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.260815 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.516941 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.531181 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.556902 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.643380 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.703147 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.763860 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.784787 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.932208 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar
09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.166823 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.176305 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.233754 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.305474 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.333829 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.377834 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.389582 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.419716 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.425787 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.446635 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.530109 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.531953 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.559190 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.559285 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.569035 4764 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.569104 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.569144 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.569519 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"fb20688a647451d7fe5639f828c352ed485f469ea775fff8f4e5ab64cd8e6498"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted"
Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.569666 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://fb20688a647451d7fe5639f828c352ed485f469ea775fff8f4e5ab64cd8e6498" gracePeriod=30
Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.571454 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.578776 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.591779 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.612148 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.648357 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.698484 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.778170 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.826615 4764 reflector.go:368] Caches populated for *v1.Secret from
object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.878898 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.933714 4764 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.966634 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.204226 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.262649 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.275222 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.308256 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.326383 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.364981 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.438691 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 
13:26:51.484072 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.493701 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.500843 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.519404 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.536522 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.559737 4764 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.562145 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.623610 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.761398 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.829105 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.847288 4764 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 09 13:26:52 crc kubenswrapper[4764]: I0309 13:26:52.107327 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 09 13:26:52 crc kubenswrapper[4764]: I0309 13:26:52.159877 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 09 13:26:52 crc kubenswrapper[4764]: I0309 13:26:52.218166 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 09 13:26:52 crc kubenswrapper[4764]: I0309 13:26:52.218283 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 09 13:26:52 crc kubenswrapper[4764]: I0309 13:26:52.368180 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 09 13:26:52 crc kubenswrapper[4764]: I0309 13:26:52.463875 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 09 13:26:52 crc kubenswrapper[4764]: I0309 13:26:52.530881 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 09 13:26:52 crc kubenswrapper[4764]: I0309 13:26:52.538894 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 09 13:26:52 crc kubenswrapper[4764]: I0309 13:26:52.663412 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 09 13:26:52 crc kubenswrapper[4764]: I0309 13:26:52.694929 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 09 13:26:52 crc kubenswrapper[4764]: I0309 
13:26:52.710053 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 09 13:26:52 crc kubenswrapper[4764]: I0309 13:26:52.804024 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 09 13:26:52 crc kubenswrapper[4764]: I0309 13:26:52.840945 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 09 13:26:52 crc kubenswrapper[4764]: I0309 13:26:52.883442 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 09 13:26:52 crc kubenswrapper[4764]: I0309 13:26:52.959587 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 09 13:26:52 crc kubenswrapper[4764]: I0309 13:26:52.985851 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.196168 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.265381 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.331227 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.337073 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.446908 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 09 13:26:53 
crc kubenswrapper[4764]: I0309 13:26:53.487943 4764 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.497482 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.512234 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.519470 4764 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.520140 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=44.520122512 podStartE2EDuration="44.520122512s" podCreationTimestamp="2026-03-09 13:26:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:26:29.66090189 +0000 UTC m=+344.911073808" watchObservedRunningTime="2026-03-09 13:26:53.520122512 +0000 UTC m=+368.770294430" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.525525 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nj856","openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.525584 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-fb6b676c8-zsq8z","openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 13:26:53 crc kubenswrapper[4764]: E0309 13:26:53.525895 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9b3244b-8df0-4330-9887-4092260d416a" containerName="oauth-openshift" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.525915 4764 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f9b3244b-8df0-4330-9887-4092260d416a" containerName="oauth-openshift" Mar 09 13:26:53 crc kubenswrapper[4764]: E0309 13:26:53.525940 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6079e5ed-2acc-42f5-a62e-ea2a98b18abd" containerName="installer" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.525948 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6079e5ed-2acc-42f5-a62e-ea2a98b18abd" containerName="installer" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.526055 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9b3244b-8df0-4330-9887-4092260d416a" containerName="oauth-openshift" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.526071 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6079e5ed-2acc-42f5-a62e-ea2a98b18abd" containerName="installer" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.526083 4764 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7c21ab2-1820-47de-a61d-71d81928564a" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.526108 4764 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7c21ab2-1820-47de-a61d-71d81928564a" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.526453 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g7k9k"] Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.526594 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.526709 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-g7k9k" podUID="7a967c79-e11e-4c58-b42e-652d1406ac88" containerName="registry-server" containerID="cri-o://018f9058cd176b4bd85208014309fc2aaec24cd432697bbe4160e6f16524159a" gracePeriod=2 Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.531916 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.532148 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.532295 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.532608 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.533251 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.533425 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.533747 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.534084 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 09 13:26:53 crc 
kubenswrapper[4764]: I0309 13:26:53.535662 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.537623 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.538026 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.538190 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.538568 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.540211 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.544605 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.545953 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.546732 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.557020 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.558100 4764 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.567517 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=24.567492697 podStartE2EDuration="24.567492697s" podCreationTimestamp="2026-03-09 13:26:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:26:53.563555319 +0000 UTC m=+368.813727237" watchObservedRunningTime="2026-03-09 13:26:53.567492697 +0000 UTC m=+368.817664615" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.568790 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9b3244b-8df0-4330-9887-4092260d416a" path="/var/lib/kubelet/pods/f9b3244b-8df0-4330-9887-4092260d416a/volumes" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.625578 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.648491 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-session\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.649028 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-user-template-error\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " 
pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.649064 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.649095 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.649264 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/30bdf0f7-c597-42b1-80a1-20dd593c3333-audit-dir\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.649469 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.649550 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.649603 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k489f\" (UniqueName: \"kubernetes.io/projected/30bdf0f7-c597-42b1-80a1-20dd593c3333-kube-api-access-k489f\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.649633 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-router-certs\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.649680 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-cliconfig\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.649728 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/30bdf0f7-c597-42b1-80a1-20dd593c3333-audit-policies\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.649900 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-serving-cert\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.649936 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-user-template-login\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.649975 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-service-ca\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.705529 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.751131 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.751193 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.751217 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k489f\" (UniqueName: \"kubernetes.io/projected/30bdf0f7-c597-42b1-80a1-20dd593c3333-kube-api-access-k489f\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.751234 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-cliconfig\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.751255 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-router-certs\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " 
pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.751290 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/30bdf0f7-c597-42b1-80a1-20dd593c3333-audit-policies\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.751330 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-serving-cert\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.751350 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-user-template-login\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.751371 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-service-ca\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.751420 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-session\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.751436 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-user-template-error\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.751453 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.751470 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.751493 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/30bdf0f7-c597-42b1-80a1-20dd593c3333-audit-dir\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " 
pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.751578 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/30bdf0f7-c597-42b1-80a1-20dd593c3333-audit-dir\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.752351 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.752947 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-cliconfig\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.752959 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-service-ca\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.753577 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/30bdf0f7-c597-42b1-80a1-20dd593c3333-audit-policies\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.756817 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.756900 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.757200 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-router-certs\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.757264 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-session\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 
crc kubenswrapper[4764]: I0309 13:26:53.757610 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-serving-cert\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.757875 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-user-template-login\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.758632 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-user-template-error\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.759877 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.770969 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k489f\" (UniqueName: \"kubernetes.io/projected/30bdf0f7-c597-42b1-80a1-20dd593c3333-kube-api-access-k489f\") 
pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.816168 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.845699 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.851971 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.934907 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.958983 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g7k9k" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.982445 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.985775 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.061272 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.137326 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.156972 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a967c79-e11e-4c58-b42e-652d1406ac88-utilities\") pod \"7a967c79-e11e-4c58-b42e-652d1406ac88\" (UID: \"7a967c79-e11e-4c58-b42e-652d1406ac88\") " Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.157162 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gx2jl\" (UniqueName: \"kubernetes.io/projected/7a967c79-e11e-4c58-b42e-652d1406ac88-kube-api-access-gx2jl\") pod \"7a967c79-e11e-4c58-b42e-652d1406ac88\" (UID: \"7a967c79-e11e-4c58-b42e-652d1406ac88\") " Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.157221 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a967c79-e11e-4c58-b42e-652d1406ac88-catalog-content\") pod \"7a967c79-e11e-4c58-b42e-652d1406ac88\" (UID: \"7a967c79-e11e-4c58-b42e-652d1406ac88\") " Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.160472 4764 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/7a967c79-e11e-4c58-b42e-652d1406ac88-utilities" (OuterVolumeSpecName: "utilities") pod "7a967c79-e11e-4c58-b42e-652d1406ac88" (UID: "7a967c79-e11e-4c58-b42e-652d1406ac88"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.164020 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a967c79-e11e-4c58-b42e-652d1406ac88-kube-api-access-gx2jl" (OuterVolumeSpecName: "kube-api-access-gx2jl") pod "7a967c79-e11e-4c58-b42e-652d1406ac88" (UID: "7a967c79-e11e-4c58-b42e-652d1406ac88"). InnerVolumeSpecName "kube-api-access-gx2jl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.172729 4764 generic.go:334] "Generic (PLEG): container finished" podID="7a967c79-e11e-4c58-b42e-652d1406ac88" containerID="018f9058cd176b4bd85208014309fc2aaec24cd432697bbe4160e6f16524159a" exitCode=0 Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.172835 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g7k9k" Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.173474 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7k9k" event={"ID":"7a967c79-e11e-4c58-b42e-652d1406ac88","Type":"ContainerDied","Data":"018f9058cd176b4bd85208014309fc2aaec24cd432697bbe4160e6f16524159a"} Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.173512 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7k9k" event={"ID":"7a967c79-e11e-4c58-b42e-652d1406ac88","Type":"ContainerDied","Data":"2183e838c5144408fdc015b8deb0cb2c5e715404d51e8b64aa5f21859f0ebf3c"} Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.173532 4764 scope.go:117] "RemoveContainer" containerID="018f9058cd176b4bd85208014309fc2aaec24cd432697bbe4160e6f16524159a" Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.194259 4764 scope.go:117] "RemoveContainer" containerID="5ed7c7fd6e4e524d04d9ab3277bdad76d1f17e39f4ce65077628067a3a616861" Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.198923 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a967c79-e11e-4c58-b42e-652d1406ac88-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a967c79-e11e-4c58-b42e-652d1406ac88" (UID: "7a967c79-e11e-4c58-b42e-652d1406ac88"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.214769 4764 scope.go:117] "RemoveContainer" containerID="a7592b9bea39d66aa3632078215711f53e09d218ed98410b531203d64bd85eca" Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.230972 4764 scope.go:117] "RemoveContainer" containerID="018f9058cd176b4bd85208014309fc2aaec24cd432697bbe4160e6f16524159a" Mar 09 13:26:54 crc kubenswrapper[4764]: E0309 13:26:54.231647 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"018f9058cd176b4bd85208014309fc2aaec24cd432697bbe4160e6f16524159a\": container with ID starting with 018f9058cd176b4bd85208014309fc2aaec24cd432697bbe4160e6f16524159a not found: ID does not exist" containerID="018f9058cd176b4bd85208014309fc2aaec24cd432697bbe4160e6f16524159a" Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.231699 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"018f9058cd176b4bd85208014309fc2aaec24cd432697bbe4160e6f16524159a"} err="failed to get container status \"018f9058cd176b4bd85208014309fc2aaec24cd432697bbe4160e6f16524159a\": rpc error: code = NotFound desc = could not find container \"018f9058cd176b4bd85208014309fc2aaec24cd432697bbe4160e6f16524159a\": container with ID starting with 018f9058cd176b4bd85208014309fc2aaec24cd432697bbe4160e6f16524159a not found: ID does not exist" Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.231722 4764 scope.go:117] "RemoveContainer" containerID="5ed7c7fd6e4e524d04d9ab3277bdad76d1f17e39f4ce65077628067a3a616861" Mar 09 13:26:54 crc kubenswrapper[4764]: E0309 13:26:54.232134 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ed7c7fd6e4e524d04d9ab3277bdad76d1f17e39f4ce65077628067a3a616861\": container with ID starting with 
5ed7c7fd6e4e524d04d9ab3277bdad76d1f17e39f4ce65077628067a3a616861 not found: ID does not exist" containerID="5ed7c7fd6e4e524d04d9ab3277bdad76d1f17e39f4ce65077628067a3a616861" Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.232210 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ed7c7fd6e4e524d04d9ab3277bdad76d1f17e39f4ce65077628067a3a616861"} err="failed to get container status \"5ed7c7fd6e4e524d04d9ab3277bdad76d1f17e39f4ce65077628067a3a616861\": rpc error: code = NotFound desc = could not find container \"5ed7c7fd6e4e524d04d9ab3277bdad76d1f17e39f4ce65077628067a3a616861\": container with ID starting with 5ed7c7fd6e4e524d04d9ab3277bdad76d1f17e39f4ce65077628067a3a616861 not found: ID does not exist" Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.232300 4764 scope.go:117] "RemoveContainer" containerID="a7592b9bea39d66aa3632078215711f53e09d218ed98410b531203d64bd85eca" Mar 09 13:26:54 crc kubenswrapper[4764]: E0309 13:26:54.232983 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7592b9bea39d66aa3632078215711f53e09d218ed98410b531203d64bd85eca\": container with ID starting with a7592b9bea39d66aa3632078215711f53e09d218ed98410b531203d64bd85eca not found: ID does not exist" containerID="a7592b9bea39d66aa3632078215711f53e09d218ed98410b531203d64bd85eca" Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.233038 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7592b9bea39d66aa3632078215711f53e09d218ed98410b531203d64bd85eca"} err="failed to get container status \"a7592b9bea39d66aa3632078215711f53e09d218ed98410b531203d64bd85eca\": rpc error: code = NotFound desc = could not find container \"a7592b9bea39d66aa3632078215711f53e09d218ed98410b531203d64bd85eca\": container with ID starting with a7592b9bea39d66aa3632078215711f53e09d218ed98410b531203d64bd85eca not found: ID does not 
exist" Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.260003 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gx2jl\" (UniqueName: \"kubernetes.io/projected/7a967c79-e11e-4c58-b42e-652d1406ac88-kube-api-access-gx2jl\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.260030 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a967c79-e11e-4c58-b42e-652d1406ac88-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.260038 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a967c79-e11e-4c58-b42e-652d1406ac88-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.289737 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"] Mar 09 13:26:54 crc kubenswrapper[4764]: W0309 13:26:54.292374 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30bdf0f7_c597_42b1_80a1_20dd593c3333.slice/crio-e2458c779c30a86ce1ffd2d7c27698d8f52d1ee90a69670513e1fb9403db4910 WatchSource:0}: Error finding container e2458c779c30a86ce1ffd2d7c27698d8f52d1ee90a69670513e1fb9403db4910: Status 404 returned error can't find the container with id e2458c779c30a86ce1ffd2d7c27698d8f52d1ee90a69670513e1fb9403db4910 Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.502411 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g7k9k"] Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.508914 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-g7k9k"] Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.583503 4764 reflector.go:368] Caches populated for 
*v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.663312 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.778787 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.810030 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.846390 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.088875 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.179591 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" event={"ID":"30bdf0f7-c597-42b1-80a1-20dd593c3333","Type":"ContainerStarted","Data":"fbf8ad1016fa6322c6426f8e0aa6c9b8ce7c5b7f4dd4a83bfa7292d4743055e4"} Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.179637 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" event={"ID":"30bdf0f7-c597-42b1-80a1-20dd593c3333","Type":"ContainerStarted","Data":"e2458c779c30a86ce1ffd2d7c27698d8f52d1ee90a69670513e1fb9403db4910"} Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.179968 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.185582 4764 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.198415 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" podStartSLOduration=58.198400025 podStartE2EDuration="58.198400025s" podCreationTimestamp="2026-03-09 13:25:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:26:55.198182589 +0000 UTC m=+370.448354507" watchObservedRunningTime="2026-03-09 13:26:55.198400025 +0000 UTC m=+370.448571933" Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.226452 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.411833 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.424490 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.443096 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.479278 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.542499 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.574535 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7a967c79-e11e-4c58-b42e-652d1406ac88" path="/var/lib/kubelet/pods/7a967c79-e11e-4c58-b42e-652d1406ac88/volumes" Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.620292 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.835156 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.835972 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.919068 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.919152 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 09 13:26:56 crc kubenswrapper[4764]: I0309 13:26:56.167143 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 09 13:26:56 crc kubenswrapper[4764]: I0309 13:26:56.203141 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 09 13:26:56 crc kubenswrapper[4764]: I0309 13:26:56.463675 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 09 13:26:56 crc kubenswrapper[4764]: I0309 13:26:56.634353 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 09 13:26:56 crc kubenswrapper[4764]: I0309 13:26:56.655196 4764 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 09 13:26:57 crc kubenswrapper[4764]: I0309 13:26:57.301948 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 09 13:26:57 crc kubenswrapper[4764]: I0309 13:26:57.650688 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 09 13:26:57 crc kubenswrapper[4764]: I0309 13:26:57.763438 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 09 13:26:57 crc kubenswrapper[4764]: I0309 13:26:57.788131 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 09 13:26:59 crc kubenswrapper[4764]: I0309 13:26:59.924739 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 09 13:27:03 crc kubenswrapper[4764]: I0309 13:27:03.595521 4764 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 09 13:27:03 crc kubenswrapper[4764]: I0309 13:27:03.596036 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://34d89c76e37e470d7c44361ef7ee1eb9e62495b1f676f3e88198ea7b09a66772" gracePeriod=5 Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.173184 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.174061 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.257611 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.258087 4764 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="34d89c76e37e470d7c44361ef7ee1eb9e62495b1f676f3e88198ea7b09a66772" exitCode=137 Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.258168 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.258175 4764 scope.go:117] "RemoveContainer" containerID="34d89c76e37e470d7c44361ef7ee1eb9e62495b1f676f3e88198ea7b09a66772" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.275796 4764 scope.go:117] "RemoveContainer" containerID="34d89c76e37e470d7c44361ef7ee1eb9e62495b1f676f3e88198ea7b09a66772" Mar 09 13:27:09 crc kubenswrapper[4764]: E0309 13:27:09.276364 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34d89c76e37e470d7c44361ef7ee1eb9e62495b1f676f3e88198ea7b09a66772\": container with ID starting with 34d89c76e37e470d7c44361ef7ee1eb9e62495b1f676f3e88198ea7b09a66772 not found: ID does not exist" containerID="34d89c76e37e470d7c44361ef7ee1eb9e62495b1f676f3e88198ea7b09a66772" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.276406 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34d89c76e37e470d7c44361ef7ee1eb9e62495b1f676f3e88198ea7b09a66772"} err="failed to get container status \"34d89c76e37e470d7c44361ef7ee1eb9e62495b1f676f3e88198ea7b09a66772\": rpc error: code = NotFound desc = could 
not find container \"34d89c76e37e470d7c44361ef7ee1eb9e62495b1f676f3e88198ea7b09a66772\": container with ID starting with 34d89c76e37e470d7c44361ef7ee1eb9e62495b1f676f3e88198ea7b09a66772 not found: ID does not exist" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.366823 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.366892 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.366946 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.366980 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.367038 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.367488 4764 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.367545 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.367625 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.367958 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.376140 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.468587 4764 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.468673 4764 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.468686 4764 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.468695 4764 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.468707 4764 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.566885 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.567168 4764 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.582560 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 09 13:27:09 crc kubenswrapper[4764]: 
I0309 13:27:09.582626 4764 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="9b9d0900-658e-4494-b290-05115386e626" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.586980 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.587014 4764 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="9b9d0900-658e-4494-b290-05115386e626" Mar 09 13:27:21 crc kubenswrapper[4764]: I0309 13:27:21.351316 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 09 13:27:21 crc kubenswrapper[4764]: I0309 13:27:21.354206 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 09 13:27:21 crc kubenswrapper[4764]: I0309 13:27:21.355469 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 09 13:27:21 crc kubenswrapper[4764]: I0309 13:27:21.355518 4764 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="fb20688a647451d7fe5639f828c352ed485f469ea775fff8f4e5ab64cd8e6498" exitCode=137 Mar 09 13:27:21 crc kubenswrapper[4764]: I0309 13:27:21.355550 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"fb20688a647451d7fe5639f828c352ed485f469ea775fff8f4e5ab64cd8e6498"} 
Mar 09 13:27:21 crc kubenswrapper[4764]: I0309 13:27:21.355590 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2ab451f1911e48e54d9d1932bca99cb132f443fb9a8d6dc8e6229a6e072ef358"} Mar 09 13:27:21 crc kubenswrapper[4764]: I0309 13:27:21.355612 4764 scope.go:117] "RemoveContainer" containerID="c7857b72b1425eeefd8acf226dd6f0d6c783e14267fcfb79dd038118573a2b26" Mar 09 13:27:22 crc kubenswrapper[4764]: I0309 13:27:22.121269 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:27:22 crc kubenswrapper[4764]: I0309 13:27:22.366275 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 09 13:27:22 crc kubenswrapper[4764]: I0309 13:27:22.367952 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 09 13:27:30 crc kubenswrapper[4764]: I0309 13:27:30.568440 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:27:30 crc kubenswrapper[4764]: I0309 13:27:30.575342 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:27:31 crc kubenswrapper[4764]: I0309 13:27:31.418451 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:27:58 crc kubenswrapper[4764]: I0309 13:27:58.370731 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:27:58 crc kubenswrapper[4764]: I0309 13:27:58.371381 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:28:00 crc kubenswrapper[4764]: I0309 13:28:00.204961 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551048-fgf8g"] Mar 09 13:28:00 crc kubenswrapper[4764]: E0309 13:28:00.205804 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a967c79-e11e-4c58-b42e-652d1406ac88" containerName="extract-utilities" Mar 09 13:28:00 crc kubenswrapper[4764]: I0309 13:28:00.205829 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a967c79-e11e-4c58-b42e-652d1406ac88" containerName="extract-utilities" Mar 09 13:28:00 crc kubenswrapper[4764]: E0309 13:28:00.205865 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a967c79-e11e-4c58-b42e-652d1406ac88" containerName="extract-content" Mar 09 13:28:00 crc kubenswrapper[4764]: I0309 13:28:00.205878 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a967c79-e11e-4c58-b42e-652d1406ac88" containerName="extract-content" Mar 09 13:28:00 crc kubenswrapper[4764]: E0309 13:28:00.205905 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a967c79-e11e-4c58-b42e-652d1406ac88" containerName="registry-server" Mar 09 13:28:00 crc kubenswrapper[4764]: I0309 13:28:00.205922 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a967c79-e11e-4c58-b42e-652d1406ac88" containerName="registry-server" Mar 09 13:28:00 
crc kubenswrapper[4764]: E0309 13:28:00.205946 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 09 13:28:00 crc kubenswrapper[4764]: I0309 13:28:00.205958 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 09 13:28:00 crc kubenswrapper[4764]: I0309 13:28:00.206380 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a967c79-e11e-4c58-b42e-652d1406ac88" containerName="registry-server" Mar 09 13:28:00 crc kubenswrapper[4764]: I0309 13:28:00.206444 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 09 13:28:00 crc kubenswrapper[4764]: I0309 13:28:00.207334 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551048-fgf8g" Mar 09 13:28:00 crc kubenswrapper[4764]: I0309 13:28:00.209974 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 13:28:00 crc kubenswrapper[4764]: I0309 13:28:00.211563 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:28:00 crc kubenswrapper[4764]: I0309 13:28:00.212075 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:28:00 crc kubenswrapper[4764]: I0309 13:28:00.244285 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd5xb\" (UniqueName: \"kubernetes.io/projected/0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f-kube-api-access-cd5xb\") pod \"auto-csr-approver-29551048-fgf8g\" (UID: \"0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f\") " pod="openshift-infra/auto-csr-approver-29551048-fgf8g" Mar 09 13:28:00 crc kubenswrapper[4764]: I0309 13:28:00.249582 
4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551048-fgf8g"] Mar 09 13:28:00 crc kubenswrapper[4764]: I0309 13:28:00.345971 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd5xb\" (UniqueName: \"kubernetes.io/projected/0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f-kube-api-access-cd5xb\") pod \"auto-csr-approver-29551048-fgf8g\" (UID: \"0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f\") " pod="openshift-infra/auto-csr-approver-29551048-fgf8g" Mar 09 13:28:00 crc kubenswrapper[4764]: I0309 13:28:00.373943 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd5xb\" (UniqueName: \"kubernetes.io/projected/0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f-kube-api-access-cd5xb\") pod \"auto-csr-approver-29551048-fgf8g\" (UID: \"0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f\") " pod="openshift-infra/auto-csr-approver-29551048-fgf8g" Mar 09 13:28:00 crc kubenswrapper[4764]: I0309 13:28:00.541111 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551048-fgf8g" Mar 09 13:28:01 crc kubenswrapper[4764]: I0309 13:28:01.006035 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551048-fgf8g"] Mar 09 13:28:01 crc kubenswrapper[4764]: I0309 13:28:01.616589 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551048-fgf8g" event={"ID":"0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f","Type":"ContainerStarted","Data":"dff5be776046969c512855e99368dd5e3ffa23c7cd0821f06ec61907f21c8cae"} Mar 09 13:28:02 crc kubenswrapper[4764]: I0309 13:28:02.626031 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551048-fgf8g" event={"ID":"0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f","Type":"ContainerStarted","Data":"f44561c47745677a2b6e923d3f449a7c01740fc8b6e465eeb90cec0a7d1ebe67"} Mar 09 13:28:02 crc kubenswrapper[4764]: I0309 13:28:02.641696 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551048-fgf8g" podStartSLOduration=1.454571248 podStartE2EDuration="2.641677917s" podCreationTimestamp="2026-03-09 13:28:00 +0000 UTC" firstStartedPulling="2026-03-09 13:28:01.021756345 +0000 UTC m=+436.271928253" lastFinishedPulling="2026-03-09 13:28:02.208863014 +0000 UTC m=+437.459034922" observedRunningTime="2026-03-09 13:28:02.639469067 +0000 UTC m=+437.889640975" watchObservedRunningTime="2026-03-09 13:28:02.641677917 +0000 UTC m=+437.891849825" Mar 09 13:28:03 crc kubenswrapper[4764]: I0309 13:28:03.635552 4764 generic.go:334] "Generic (PLEG): container finished" podID="0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f" containerID="f44561c47745677a2b6e923d3f449a7c01740fc8b6e465eeb90cec0a7d1ebe67" exitCode=0 Mar 09 13:28:03 crc kubenswrapper[4764]: I0309 13:28:03.635607 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551048-fgf8g" 
event={"ID":"0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f","Type":"ContainerDied","Data":"f44561c47745677a2b6e923d3f449a7c01740fc8b6e465eeb90cec0a7d1ebe67"} Mar 09 13:28:04 crc kubenswrapper[4764]: I0309 13:28:04.973290 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551048-fgf8g" Mar 09 13:28:05 crc kubenswrapper[4764]: I0309 13:28:05.119568 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cd5xb\" (UniqueName: \"kubernetes.io/projected/0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f-kube-api-access-cd5xb\") pod \"0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f\" (UID: \"0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f\") " Mar 09 13:28:05 crc kubenswrapper[4764]: I0309 13:28:05.126800 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f-kube-api-access-cd5xb" (OuterVolumeSpecName: "kube-api-access-cd5xb") pod "0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f" (UID: "0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f"). InnerVolumeSpecName "kube-api-access-cd5xb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:28:05 crc kubenswrapper[4764]: I0309 13:28:05.222293 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cd5xb\" (UniqueName: \"kubernetes.io/projected/0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f-kube-api-access-cd5xb\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:05 crc kubenswrapper[4764]: I0309 13:28:05.658043 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551048-fgf8g" Mar 09 13:28:05 crc kubenswrapper[4764]: I0309 13:28:05.657895 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551048-fgf8g" event={"ID":"0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f","Type":"ContainerDied","Data":"dff5be776046969c512855e99368dd5e3ffa23c7cd0821f06ec61907f21c8cae"} Mar 09 13:28:05 crc kubenswrapper[4764]: I0309 13:28:05.660327 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dff5be776046969c512855e99368dd5e3ffa23c7cd0821f06ec61907f21c8cae" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.162726 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8d627"] Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.163545 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8d627" podUID="88ba6041-7f8f-48f0-840c-8ea2a9bdc87b" containerName="registry-server" containerID="cri-o://5dafc18a60471b2aa074fa19f2a0f2019626c0dc18f4738cc76636c111e1e0e1" gracePeriod=30 Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.187914 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nrc8s"] Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.188191 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nrc8s" podUID="be22cbfb-d3e7-43c1-be38-f6fcadeb2c97" containerName="registry-server" containerID="cri-o://0dca8e186618d6e08ce56c027491c8ad89ea2784547d9fbdcf530de2e5a43711" gracePeriod=30 Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.192277 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-d4gwh"] Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.194719 4764 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" podUID="1ccc5b44-95ad-4f4c-8086-c176c41bbd19" containerName="marketplace-operator" containerID="cri-o://2b23f753224c479f8f3f33754ceac00343e6409f6988048cb245d41d456b1666" gracePeriod=30 Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.218579 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m2gc7"] Mar 09 13:28:13 crc kubenswrapper[4764]: E0309 13:28:13.218950 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f" containerName="oc" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.218965 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f" containerName="oc" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.219069 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f" containerName="oc" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.219463 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m2gc7" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.241503 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qhs57"] Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.241860 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qhs57" podUID="691ffa6f-3ee6-47fa-bcef-9fdd74ac86df" containerName="registry-server" containerID="cri-o://d02322e51b07c573018bc35bb85c4b069d5e04cb23ef61de23f558be13356214" gracePeriod=30 Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.253688 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m2gc7"] Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.264449 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tll5t"] Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.264780 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tll5t" podUID="41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6" containerName="registry-server" containerID="cri-o://4325cf9007df54fa3b2a5bed7103ccce2df4b9e7e47622fe04491c8fac8d1503" gracePeriod=30 Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.326854 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4351c9fc-c207-4d15-b8a6-f51c0651fe83-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m2gc7\" (UID: \"4351c9fc-c207-4d15-b8a6-f51c0651fe83\") " pod="openshift-marketplace/marketplace-operator-79b997595-m2gc7" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.326900 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-4hwt8\" (UniqueName: \"kubernetes.io/projected/4351c9fc-c207-4d15-b8a6-f51c0651fe83-kube-api-access-4hwt8\") pod \"marketplace-operator-79b997595-m2gc7\" (UID: \"4351c9fc-c207-4d15-b8a6-f51c0651fe83\") " pod="openshift-marketplace/marketplace-operator-79b997595-m2gc7" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.326928 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4351c9fc-c207-4d15-b8a6-f51c0651fe83-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m2gc7\" (UID: \"4351c9fc-c207-4d15-b8a6-f51c0651fe83\") " pod="openshift-marketplace/marketplace-operator-79b997595-m2gc7" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.428631 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4351c9fc-c207-4d15-b8a6-f51c0651fe83-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m2gc7\" (UID: \"4351c9fc-c207-4d15-b8a6-f51c0651fe83\") " pod="openshift-marketplace/marketplace-operator-79b997595-m2gc7" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.428713 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hwt8\" (UniqueName: \"kubernetes.io/projected/4351c9fc-c207-4d15-b8a6-f51c0651fe83-kube-api-access-4hwt8\") pod \"marketplace-operator-79b997595-m2gc7\" (UID: \"4351c9fc-c207-4d15-b8a6-f51c0651fe83\") " pod="openshift-marketplace/marketplace-operator-79b997595-m2gc7" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.428753 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4351c9fc-c207-4d15-b8a6-f51c0651fe83-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m2gc7\" (UID: \"4351c9fc-c207-4d15-b8a6-f51c0651fe83\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-m2gc7" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.430165 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4351c9fc-c207-4d15-b8a6-f51c0651fe83-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m2gc7\" (UID: \"4351c9fc-c207-4d15-b8a6-f51c0651fe83\") " pod="openshift-marketplace/marketplace-operator-79b997595-m2gc7" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.436601 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4351c9fc-c207-4d15-b8a6-f51c0651fe83-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m2gc7\" (UID: \"4351c9fc-c207-4d15-b8a6-f51c0651fe83\") " pod="openshift-marketplace/marketplace-operator-79b997595-m2gc7" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.449937 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hwt8\" (UniqueName: \"kubernetes.io/projected/4351c9fc-c207-4d15-b8a6-f51c0651fe83-kube-api-access-4hwt8\") pod \"marketplace-operator-79b997595-m2gc7\" (UID: \"4351c9fc-c207-4d15-b8a6-f51c0651fe83\") " pod="openshift-marketplace/marketplace-operator-79b997595-m2gc7" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.539203 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m2gc7" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.670757 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.687093 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nrc8s" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.698946 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.721154 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.743567 4764 generic.go:334] "Generic (PLEG): container finished" podID="1ccc5b44-95ad-4f4c-8086-c176c41bbd19" containerID="2b23f753224c479f8f3f33754ceac00343e6409f6988048cb245d41d456b1666" exitCode=0 Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.743681 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.744154 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" event={"ID":"1ccc5b44-95ad-4f4c-8086-c176c41bbd19","Type":"ContainerDied","Data":"2b23f753224c479f8f3f33754ceac00343e6409f6988048cb245d41d456b1666"} Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.744205 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" event={"ID":"1ccc5b44-95ad-4f4c-8086-c176c41bbd19","Type":"ContainerDied","Data":"771cd63965fde5f5f03cba604e9f4e1989cf6a4881a27fbd710be5727898d90a"} Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.744226 4764 scope.go:117] "RemoveContainer" containerID="2b23f753224c479f8f3f33754ceac00343e6409f6988048cb245d41d456b1666" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.756448 4764 generic.go:334] "Generic (PLEG): container finished" podID="be22cbfb-d3e7-43c1-be38-f6fcadeb2c97" 
containerID="0dca8e186618d6e08ce56c027491c8ad89ea2784547d9fbdcf530de2e5a43711" exitCode=0 Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.756523 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrc8s" event={"ID":"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97","Type":"ContainerDied","Data":"0dca8e186618d6e08ce56c027491c8ad89ea2784547d9fbdcf530de2e5a43711"} Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.756556 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrc8s" event={"ID":"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97","Type":"ContainerDied","Data":"32332cee515b03550931490beaabd836e1f122b91e9186c7afe19395bde21caa"} Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.756629 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nrc8s" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.760751 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.764461 4764 generic.go:334] "Generic (PLEG): container finished" podID="88ba6041-7f8f-48f0-840c-8ea2a9bdc87b" containerID="5dafc18a60471b2aa074fa19f2a0f2019626c0dc18f4738cc76636c111e1e0e1" exitCode=0 Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.764530 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d627" event={"ID":"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b","Type":"ContainerDied","Data":"5dafc18a60471b2aa074fa19f2a0f2019626c0dc18f4738cc76636c111e1e0e1"} Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.764557 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.764565 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d627" event={"ID":"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b","Type":"ContainerDied","Data":"afb89024f1733e90994f05a617baca8bd2578c08f57794c20e8e0031fa2f63e4"} Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.766258 4764 scope.go:117] "RemoveContainer" containerID="2b23f753224c479f8f3f33754ceac00343e6409f6988048cb245d41d456b1666" Mar 09 13:28:13 crc kubenswrapper[4764]: E0309 13:28:13.766603 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b23f753224c479f8f3f33754ceac00343e6409f6988048cb245d41d456b1666\": container with ID starting with 2b23f753224c479f8f3f33754ceac00343e6409f6988048cb245d41d456b1666 not found: ID does not exist" containerID="2b23f753224c479f8f3f33754ceac00343e6409f6988048cb245d41d456b1666" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.766637 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b23f753224c479f8f3f33754ceac00343e6409f6988048cb245d41d456b1666"} err="failed to get container status \"2b23f753224c479f8f3f33754ceac00343e6409f6988048cb245d41d456b1666\": rpc error: code = NotFound desc = could not find container \"2b23f753224c479f8f3f33754ceac00343e6409f6988048cb245d41d456b1666\": container with ID starting with 2b23f753224c479f8f3f33754ceac00343e6409f6988048cb245d41d456b1666 not found: ID does not exist" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.766681 4764 scope.go:117] "RemoveContainer" containerID="0dca8e186618d6e08ce56c027491c8ad89ea2784547d9fbdcf530de2e5a43711" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.772221 4764 generic.go:334] "Generic (PLEG): container finished" podID="41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6" 
containerID="4325cf9007df54fa3b2a5bed7103ccce2df4b9e7e47622fe04491c8fac8d1503" exitCode=0 Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.772324 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tll5t" event={"ID":"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6","Type":"ContainerDied","Data":"4325cf9007df54fa3b2a5bed7103ccce2df4b9e7e47622fe04491c8fac8d1503"} Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.772742 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.779068 4764 generic.go:334] "Generic (PLEG): container finished" podID="691ffa6f-3ee6-47fa-bcef-9fdd74ac86df" containerID="d02322e51b07c573018bc35bb85c4b069d5e04cb23ef61de23f558be13356214" exitCode=0 Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.779121 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qhs57" event={"ID":"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df","Type":"ContainerDied","Data":"d02322e51b07c573018bc35bb85c4b069d5e04cb23ef61de23f558be13356214"} Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.779154 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qhs57" event={"ID":"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df","Type":"ContainerDied","Data":"eabffbe2f3a51c427a01ad46e2c40728c19297f3e8e305f2763268cbfbeb6ba0"} Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.779304 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.796927 4764 scope.go:117] "RemoveContainer" containerID="bf1dacbdf950a01b7bd5a3fe093c4331659602b77ce43e68786707bbc2b89ea8" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.824036 4764 scope.go:117] "RemoveContainer" containerID="fbe32af2436a64ccf4bd2766ca3822d5d50061459cac93fe26813bb870fd850a" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.837442 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fsvt\" (UniqueName: \"kubernetes.io/projected/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-kube-api-access-5fsvt\") pod \"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b\" (UID: \"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b\") " Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.837564 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-utilities\") pod \"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b\" (UID: \"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b\") " Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.837633 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-operator-metrics\") pod \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\" (UID: \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\") " Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.837691 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-utilities\") pod \"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97\" (UID: \"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97\") " Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.837753 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-trusted-ca\") pod \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\" (UID: \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\") " Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.837782 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-catalog-content\") pod \"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97\" (UID: \"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97\") " Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.837805 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff292\" (UniqueName: \"kubernetes.io/projected/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-kube-api-access-ff292\") pod \"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97\" (UID: \"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97\") " Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.837828 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4z7x\" (UniqueName: \"kubernetes.io/projected/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-kube-api-access-j4z7x\") pod \"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df\" (UID: \"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df\") " Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.837866 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-utilities\") pod \"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df\" (UID: \"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df\") " Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.837893 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-catalog-content\") pod \"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b\" (UID: 
\"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b\") " Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.837948 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s85j9\" (UniqueName: \"kubernetes.io/projected/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-kube-api-access-s85j9\") pod \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\" (UID: \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\") " Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.837969 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-catalog-content\") pod \"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df\" (UID: \"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df\") " Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.838615 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-utilities" (OuterVolumeSpecName: "utilities") pod "88ba6041-7f8f-48f0-840c-8ea2a9bdc87b" (UID: "88ba6041-7f8f-48f0-840c-8ea2a9bdc87b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.839087 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "1ccc5b44-95ad-4f4c-8086-c176c41bbd19" (UID: "1ccc5b44-95ad-4f4c-8086-c176c41bbd19"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.839173 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-utilities" (OuterVolumeSpecName: "utilities") pod "be22cbfb-d3e7-43c1-be38-f6fcadeb2c97" (UID: "be22cbfb-d3e7-43c1-be38-f6fcadeb2c97"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.840290 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-utilities" (OuterVolumeSpecName: "utilities") pod "691ffa6f-3ee6-47fa-bcef-9fdd74ac86df" (UID: "691ffa6f-3ee6-47fa-bcef-9fdd74ac86df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.842339 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "1ccc5b44-95ad-4f4c-8086-c176c41bbd19" (UID: "1ccc5b44-95ad-4f4c-8086-c176c41bbd19"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.842695 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-kube-api-access-ff292" (OuterVolumeSpecName: "kube-api-access-ff292") pod "be22cbfb-d3e7-43c1-be38-f6fcadeb2c97" (UID: "be22cbfb-d3e7-43c1-be38-f6fcadeb2c97"). InnerVolumeSpecName "kube-api-access-ff292". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.843484 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-kube-api-access-j4z7x" (OuterVolumeSpecName: "kube-api-access-j4z7x") pod "691ffa6f-3ee6-47fa-bcef-9fdd74ac86df" (UID: "691ffa6f-3ee6-47fa-bcef-9fdd74ac86df"). InnerVolumeSpecName "kube-api-access-j4z7x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.843537 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-kube-api-access-s85j9" (OuterVolumeSpecName: "kube-api-access-s85j9") pod "1ccc5b44-95ad-4f4c-8086-c176c41bbd19" (UID: "1ccc5b44-95ad-4f4c-8086-c176c41bbd19"). InnerVolumeSpecName "kube-api-access-s85j9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.848821 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-kube-api-access-5fsvt" (OuterVolumeSpecName: "kube-api-access-5fsvt") pod "88ba6041-7f8f-48f0-840c-8ea2a9bdc87b" (UID: "88ba6041-7f8f-48f0-840c-8ea2a9bdc87b"). InnerVolumeSpecName "kube-api-access-5fsvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.857179 4764 scope.go:117] "RemoveContainer" containerID="0dca8e186618d6e08ce56c027491c8ad89ea2784547d9fbdcf530de2e5a43711" Mar 09 13:28:13 crc kubenswrapper[4764]: E0309 13:28:13.857675 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dca8e186618d6e08ce56c027491c8ad89ea2784547d9fbdcf530de2e5a43711\": container with ID starting with 0dca8e186618d6e08ce56c027491c8ad89ea2784547d9fbdcf530de2e5a43711 not found: ID does not exist" containerID="0dca8e186618d6e08ce56c027491c8ad89ea2784547d9fbdcf530de2e5a43711" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.857719 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dca8e186618d6e08ce56c027491c8ad89ea2784547d9fbdcf530de2e5a43711"} err="failed to get container status \"0dca8e186618d6e08ce56c027491c8ad89ea2784547d9fbdcf530de2e5a43711\": rpc error: code = NotFound desc = 
could not find container \"0dca8e186618d6e08ce56c027491c8ad89ea2784547d9fbdcf530de2e5a43711\": container with ID starting with 0dca8e186618d6e08ce56c027491c8ad89ea2784547d9fbdcf530de2e5a43711 not found: ID does not exist" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.857746 4764 scope.go:117] "RemoveContainer" containerID="bf1dacbdf950a01b7bd5a3fe093c4331659602b77ce43e68786707bbc2b89ea8" Mar 09 13:28:13 crc kubenswrapper[4764]: E0309 13:28:13.858122 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf1dacbdf950a01b7bd5a3fe093c4331659602b77ce43e68786707bbc2b89ea8\": container with ID starting with bf1dacbdf950a01b7bd5a3fe093c4331659602b77ce43e68786707bbc2b89ea8 not found: ID does not exist" containerID="bf1dacbdf950a01b7bd5a3fe093c4331659602b77ce43e68786707bbc2b89ea8" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.858168 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf1dacbdf950a01b7bd5a3fe093c4331659602b77ce43e68786707bbc2b89ea8"} err="failed to get container status \"bf1dacbdf950a01b7bd5a3fe093c4331659602b77ce43e68786707bbc2b89ea8\": rpc error: code = NotFound desc = could not find container \"bf1dacbdf950a01b7bd5a3fe093c4331659602b77ce43e68786707bbc2b89ea8\": container with ID starting with bf1dacbdf950a01b7bd5a3fe093c4331659602b77ce43e68786707bbc2b89ea8 not found: ID does not exist" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.858194 4764 scope.go:117] "RemoveContainer" containerID="fbe32af2436a64ccf4bd2766ca3822d5d50061459cac93fe26813bb870fd850a" Mar 09 13:28:13 crc kubenswrapper[4764]: E0309 13:28:13.858484 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbe32af2436a64ccf4bd2766ca3822d5d50061459cac93fe26813bb870fd850a\": container with ID starting with fbe32af2436a64ccf4bd2766ca3822d5d50061459cac93fe26813bb870fd850a not 
found: ID does not exist" containerID="fbe32af2436a64ccf4bd2766ca3822d5d50061459cac93fe26813bb870fd850a" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.858515 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbe32af2436a64ccf4bd2766ca3822d5d50061459cac93fe26813bb870fd850a"} err="failed to get container status \"fbe32af2436a64ccf4bd2766ca3822d5d50061459cac93fe26813bb870fd850a\": rpc error: code = NotFound desc = could not find container \"fbe32af2436a64ccf4bd2766ca3822d5d50061459cac93fe26813bb870fd850a\": container with ID starting with fbe32af2436a64ccf4bd2766ca3822d5d50061459cac93fe26813bb870fd850a not found: ID does not exist" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.858536 4764 scope.go:117] "RemoveContainer" containerID="5dafc18a60471b2aa074fa19f2a0f2019626c0dc18f4738cc76636c111e1e0e1" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.873944 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "691ffa6f-3ee6-47fa-bcef-9fdd74ac86df" (UID: "691ffa6f-3ee6-47fa-bcef-9fdd74ac86df"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.882171 4764 scope.go:117] "RemoveContainer" containerID="8cec696202563cb93ec1099b2921beba11271e2ac2498229b87b108935e01fc2" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.907479 4764 scope.go:117] "RemoveContainer" containerID="1ac1423999a9eaea944b592473f076b1d7f0bf8296be44b2cb8bd7cc41e23c1f" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.909327 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "88ba6041-7f8f-48f0-840c-8ea2a9bdc87b" (UID: "88ba6041-7f8f-48f0-840c-8ea2a9bdc87b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.923078 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "be22cbfb-d3e7-43c1-be38-f6fcadeb2c97" (UID: "be22cbfb-d3e7-43c1-be38-f6fcadeb2c97"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.939388 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v76nj\" (UniqueName: \"kubernetes.io/projected/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-kube-api-access-v76nj\") pod \"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6\" (UID: \"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6\") " Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.939565 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-utilities\") pod \"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6\" (UID: \"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6\") " Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.939702 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-catalog-content\") pod \"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6\" (UID: \"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6\") " Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.939751 4764 scope.go:117] "RemoveContainer" containerID="5dafc18a60471b2aa074fa19f2a0f2019626c0dc18f4738cc76636c111e1e0e1" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.940097 4764 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.940122 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.940153 4764 reconciler_common.go:293] "Volume detached for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.940165 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.940177 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff292\" (UniqueName: \"kubernetes.io/projected/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-kube-api-access-ff292\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.940188 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4z7x\" (UniqueName: \"kubernetes.io/projected/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-kube-api-access-j4z7x\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.940197 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.940207 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.940313 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s85j9\" (UniqueName: \"kubernetes.io/projected/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-kube-api-access-s85j9\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.940326 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.940336 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fsvt\" (UniqueName: \"kubernetes.io/projected/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-kube-api-access-5fsvt\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.940345 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:13 crc kubenswrapper[4764]: E0309 13:28:13.940323 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dafc18a60471b2aa074fa19f2a0f2019626c0dc18f4738cc76636c111e1e0e1\": container with ID starting with 5dafc18a60471b2aa074fa19f2a0f2019626c0dc18f4738cc76636c111e1e0e1 not found: ID does not exist" containerID="5dafc18a60471b2aa074fa19f2a0f2019626c0dc18f4738cc76636c111e1e0e1" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.940422 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dafc18a60471b2aa074fa19f2a0f2019626c0dc18f4738cc76636c111e1e0e1"} err="failed to get container status \"5dafc18a60471b2aa074fa19f2a0f2019626c0dc18f4738cc76636c111e1e0e1\": rpc error: code = NotFound desc = could not find container \"5dafc18a60471b2aa074fa19f2a0f2019626c0dc18f4738cc76636c111e1e0e1\": container with ID starting with 5dafc18a60471b2aa074fa19f2a0f2019626c0dc18f4738cc76636c111e1e0e1 not found: ID does not exist" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.940448 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-utilities" (OuterVolumeSpecName: "utilities") pod 
"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6" (UID: "41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.940472 4764 scope.go:117] "RemoveContainer" containerID="8cec696202563cb93ec1099b2921beba11271e2ac2498229b87b108935e01fc2" Mar 09 13:28:13 crc kubenswrapper[4764]: E0309 13:28:13.941275 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cec696202563cb93ec1099b2921beba11271e2ac2498229b87b108935e01fc2\": container with ID starting with 8cec696202563cb93ec1099b2921beba11271e2ac2498229b87b108935e01fc2 not found: ID does not exist" containerID="8cec696202563cb93ec1099b2921beba11271e2ac2498229b87b108935e01fc2" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.941331 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cec696202563cb93ec1099b2921beba11271e2ac2498229b87b108935e01fc2"} err="failed to get container status \"8cec696202563cb93ec1099b2921beba11271e2ac2498229b87b108935e01fc2\": rpc error: code = NotFound desc = could not find container \"8cec696202563cb93ec1099b2921beba11271e2ac2498229b87b108935e01fc2\": container with ID starting with 8cec696202563cb93ec1099b2921beba11271e2ac2498229b87b108935e01fc2 not found: ID does not exist" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.941384 4764 scope.go:117] "RemoveContainer" containerID="1ac1423999a9eaea944b592473f076b1d7f0bf8296be44b2cb8bd7cc41e23c1f" Mar 09 13:28:13 crc kubenswrapper[4764]: E0309 13:28:13.941826 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ac1423999a9eaea944b592473f076b1d7f0bf8296be44b2cb8bd7cc41e23c1f\": container with ID starting with 1ac1423999a9eaea944b592473f076b1d7f0bf8296be44b2cb8bd7cc41e23c1f not found: ID does not exist" 
containerID="1ac1423999a9eaea944b592473f076b1d7f0bf8296be44b2cb8bd7cc41e23c1f" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.941848 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ac1423999a9eaea944b592473f076b1d7f0bf8296be44b2cb8bd7cc41e23c1f"} err="failed to get container status \"1ac1423999a9eaea944b592473f076b1d7f0bf8296be44b2cb8bd7cc41e23c1f\": rpc error: code = NotFound desc = could not find container \"1ac1423999a9eaea944b592473f076b1d7f0bf8296be44b2cb8bd7cc41e23c1f\": container with ID starting with 1ac1423999a9eaea944b592473f076b1d7f0bf8296be44b2cb8bd7cc41e23c1f not found: ID does not exist" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.941861 4764 scope.go:117] "RemoveContainer" containerID="4325cf9007df54fa3b2a5bed7103ccce2df4b9e7e47622fe04491c8fac8d1503" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.942676 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-kube-api-access-v76nj" (OuterVolumeSpecName: "kube-api-access-v76nj") pod "41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6" (UID: "41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6"). InnerVolumeSpecName "kube-api-access-v76nj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.970353 4764 scope.go:117] "RemoveContainer" containerID="0ae328f899ade57662b7a57d61c5864e374b6610f4676d30b76a8c7048f7853c" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.992717 4764 scope.go:117] "RemoveContainer" containerID="ad5ce40518308f615c3da026f6c3fbf2d4af3ef9e2a050d739fa7a496bac2d88" Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.011967 4764 scope.go:117] "RemoveContainer" containerID="d02322e51b07c573018bc35bb85c4b069d5e04cb23ef61de23f558be13356214" Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.025061 4764 scope.go:117] "RemoveContainer" containerID="aca0e54cf991318058295c5dea65c4ff3d25ff689a99973580ba7e63019a2803" Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.038746 4764 scope.go:117] "RemoveContainer" containerID="05c2ccc0c11f9b7d9a5361cced5932ce9d54a4f6f29e12db9d17e75834fe8ead" Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.042167 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v76nj\" (UniqueName: \"kubernetes.io/projected/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-kube-api-access-v76nj\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.042191 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.053284 4764 scope.go:117] "RemoveContainer" containerID="d02322e51b07c573018bc35bb85c4b069d5e04cb23ef61de23f558be13356214" Mar 09 13:28:14 crc kubenswrapper[4764]: E0309 13:28:14.053698 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d02322e51b07c573018bc35bb85c4b069d5e04cb23ef61de23f558be13356214\": container with ID starting with 
d02322e51b07c573018bc35bb85c4b069d5e04cb23ef61de23f558be13356214 not found: ID does not exist" containerID="d02322e51b07c573018bc35bb85c4b069d5e04cb23ef61de23f558be13356214" Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.053736 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d02322e51b07c573018bc35bb85c4b069d5e04cb23ef61de23f558be13356214"} err="failed to get container status \"d02322e51b07c573018bc35bb85c4b069d5e04cb23ef61de23f558be13356214\": rpc error: code = NotFound desc = could not find container \"d02322e51b07c573018bc35bb85c4b069d5e04cb23ef61de23f558be13356214\": container with ID starting with d02322e51b07c573018bc35bb85c4b069d5e04cb23ef61de23f558be13356214 not found: ID does not exist" Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.053760 4764 scope.go:117] "RemoveContainer" containerID="aca0e54cf991318058295c5dea65c4ff3d25ff689a99973580ba7e63019a2803" Mar 09 13:28:14 crc kubenswrapper[4764]: E0309 13:28:14.054092 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aca0e54cf991318058295c5dea65c4ff3d25ff689a99973580ba7e63019a2803\": container with ID starting with aca0e54cf991318058295c5dea65c4ff3d25ff689a99973580ba7e63019a2803 not found: ID does not exist" containerID="aca0e54cf991318058295c5dea65c4ff3d25ff689a99973580ba7e63019a2803" Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.054128 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aca0e54cf991318058295c5dea65c4ff3d25ff689a99973580ba7e63019a2803"} err="failed to get container status \"aca0e54cf991318058295c5dea65c4ff3d25ff689a99973580ba7e63019a2803\": rpc error: code = NotFound desc = could not find container \"aca0e54cf991318058295c5dea65c4ff3d25ff689a99973580ba7e63019a2803\": container with ID starting with aca0e54cf991318058295c5dea65c4ff3d25ff689a99973580ba7e63019a2803 not found: ID does not 
exist" Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.054157 4764 scope.go:117] "RemoveContainer" containerID="05c2ccc0c11f9b7d9a5361cced5932ce9d54a4f6f29e12db9d17e75834fe8ead" Mar 09 13:28:14 crc kubenswrapper[4764]: E0309 13:28:14.054432 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05c2ccc0c11f9b7d9a5361cced5932ce9d54a4f6f29e12db9d17e75834fe8ead\": container with ID starting with 05c2ccc0c11f9b7d9a5361cced5932ce9d54a4f6f29e12db9d17e75834fe8ead not found: ID does not exist" containerID="05c2ccc0c11f9b7d9a5361cced5932ce9d54a4f6f29e12db9d17e75834fe8ead" Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.054461 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05c2ccc0c11f9b7d9a5361cced5932ce9d54a4f6f29e12db9d17e75834fe8ead"} err="failed to get container status \"05c2ccc0c11f9b7d9a5361cced5932ce9d54a4f6f29e12db9d17e75834fe8ead\": rpc error: code = NotFound desc = could not find container \"05c2ccc0c11f9b7d9a5361cced5932ce9d54a4f6f29e12db9d17e75834fe8ead\": container with ID starting with 05c2ccc0c11f9b7d9a5361cced5932ce9d54a4f6f29e12db9d17e75834fe8ead not found: ID does not exist" Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.068181 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m2gc7"] Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.085056 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-d4gwh"] Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.089850 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-d4gwh"] Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.094279 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nrc8s"] Mar 09 13:28:14 crc 
kubenswrapper[4764]: I0309 13:28:14.098380 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nrc8s"] Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.117534 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8d627"] Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.122728 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8d627"] Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.136952 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6" (UID: "41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.142368 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qhs57"] Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.143351 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.155900 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qhs57"] Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.400270 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tll5t"] Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.406798 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tll5t"] Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.788310 4764 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m2gc7" event={"ID":"4351c9fc-c207-4d15-b8a6-f51c0651fe83","Type":"ContainerStarted","Data":"7f19bb86d875ffaf829b3a885986ef08c68ca2f33cd253f50495fb450b0f2897"} Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.788783 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m2gc7" event={"ID":"4351c9fc-c207-4d15-b8a6-f51c0651fe83","Type":"ContainerStarted","Data":"62146f286e07e7282d321a4b34ab0944d09b1aac7e95a0014dba4e4dc1525adc"} Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.788805 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-m2gc7" Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.791897 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-m2gc7" Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.809671 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-m2gc7" podStartSLOduration=1.809633409 podStartE2EDuration="1.809633409s" podCreationTimestamp="2026-03-09 13:28:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:28:14.808093117 +0000 UTC m=+450.058265045" watchObservedRunningTime="2026-03-09 13:28:14.809633409 +0000 UTC m=+450.059805317" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.211489 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bbwx5"] Mar 09 13:28:15 crc kubenswrapper[4764]: E0309 13:28:15.212125 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ccc5b44-95ad-4f4c-8086-c176c41bbd19" containerName="marketplace-operator" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 
13:28:15.212155 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ccc5b44-95ad-4f4c-8086-c176c41bbd19" containerName="marketplace-operator" Mar 09 13:28:15 crc kubenswrapper[4764]: E0309 13:28:15.212188 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6" containerName="extract-content" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.212197 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6" containerName="extract-content" Mar 09 13:28:15 crc kubenswrapper[4764]: E0309 13:28:15.212206 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88ba6041-7f8f-48f0-840c-8ea2a9bdc87b" containerName="extract-utilities" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.212215 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="88ba6041-7f8f-48f0-840c-8ea2a9bdc87b" containerName="extract-utilities" Mar 09 13:28:15 crc kubenswrapper[4764]: E0309 13:28:15.212230 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="691ffa6f-3ee6-47fa-bcef-9fdd74ac86df" containerName="registry-server" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.212237 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="691ffa6f-3ee6-47fa-bcef-9fdd74ac86df" containerName="registry-server" Mar 09 13:28:15 crc kubenswrapper[4764]: E0309 13:28:15.212253 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="691ffa6f-3ee6-47fa-bcef-9fdd74ac86df" containerName="extract-utilities" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.212260 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="691ffa6f-3ee6-47fa-bcef-9fdd74ac86df" containerName="extract-utilities" Mar 09 13:28:15 crc kubenswrapper[4764]: E0309 13:28:15.212268 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6" containerName="registry-server" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 
13:28:15.212278 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6" containerName="registry-server" Mar 09 13:28:15 crc kubenswrapper[4764]: E0309 13:28:15.212295 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be22cbfb-d3e7-43c1-be38-f6fcadeb2c97" containerName="registry-server" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.212303 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="be22cbfb-d3e7-43c1-be38-f6fcadeb2c97" containerName="registry-server" Mar 09 13:28:15 crc kubenswrapper[4764]: E0309 13:28:15.212317 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="691ffa6f-3ee6-47fa-bcef-9fdd74ac86df" containerName="extract-content" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.212429 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="691ffa6f-3ee6-47fa-bcef-9fdd74ac86df" containerName="extract-content" Mar 09 13:28:15 crc kubenswrapper[4764]: E0309 13:28:15.212445 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88ba6041-7f8f-48f0-840c-8ea2a9bdc87b" containerName="extract-content" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.212452 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="88ba6041-7f8f-48f0-840c-8ea2a9bdc87b" containerName="extract-content" Mar 09 13:28:15 crc kubenswrapper[4764]: E0309 13:28:15.212469 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6" containerName="extract-utilities" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.212476 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6" containerName="extract-utilities" Mar 09 13:28:15 crc kubenswrapper[4764]: E0309 13:28:15.212491 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be22cbfb-d3e7-43c1-be38-f6fcadeb2c97" containerName="extract-utilities" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 
13:28:15.212498 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="be22cbfb-d3e7-43c1-be38-f6fcadeb2c97" containerName="extract-utilities" Mar 09 13:28:15 crc kubenswrapper[4764]: E0309 13:28:15.212511 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88ba6041-7f8f-48f0-840c-8ea2a9bdc87b" containerName="registry-server" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.212517 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="88ba6041-7f8f-48f0-840c-8ea2a9bdc87b" containerName="registry-server" Mar 09 13:28:15 crc kubenswrapper[4764]: E0309 13:28:15.212529 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be22cbfb-d3e7-43c1-be38-f6fcadeb2c97" containerName="extract-content" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.212535 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="be22cbfb-d3e7-43c1-be38-f6fcadeb2c97" containerName="extract-content" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.212793 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="88ba6041-7f8f-48f0-840c-8ea2a9bdc87b" containerName="registry-server" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.212806 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ccc5b44-95ad-4f4c-8086-c176c41bbd19" containerName="marketplace-operator" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.212819 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6" containerName="registry-server" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.212858 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="be22cbfb-d3e7-43c1-be38-f6fcadeb2c97" containerName="registry-server" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.212873 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="691ffa6f-3ee6-47fa-bcef-9fdd74ac86df" containerName="registry-server" Mar 09 13:28:15 crc 
kubenswrapper[4764]: I0309 13:28:15.214516 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bbwx5"] Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.214658 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bbwx5" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.220143 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.360456 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tldxq\" (UniqueName: \"kubernetes.io/projected/a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa-kube-api-access-tldxq\") pod \"certified-operators-bbwx5\" (UID: \"a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa\") " pod="openshift-marketplace/certified-operators-bbwx5" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.360543 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa-utilities\") pod \"certified-operators-bbwx5\" (UID: \"a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa\") " pod="openshift-marketplace/certified-operators-bbwx5" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.360589 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa-catalog-content\") pod \"certified-operators-bbwx5\" (UID: \"a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa\") " pod="openshift-marketplace/certified-operators-bbwx5" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.461748 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tldxq\" (UniqueName: 
\"kubernetes.io/projected/a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa-kube-api-access-tldxq\") pod \"certified-operators-bbwx5\" (UID: \"a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa\") " pod="openshift-marketplace/certified-operators-bbwx5" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.463227 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa-utilities\") pod \"certified-operators-bbwx5\" (UID: \"a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa\") " pod="openshift-marketplace/certified-operators-bbwx5" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.463347 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa-catalog-content\") pod \"certified-operators-bbwx5\" (UID: \"a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa\") " pod="openshift-marketplace/certified-operators-bbwx5" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.463909 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa-catalog-content\") pod \"certified-operators-bbwx5\" (UID: \"a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa\") " pod="openshift-marketplace/certified-operators-bbwx5" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.464284 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa-utilities\") pod \"certified-operators-bbwx5\" (UID: \"a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa\") " pod="openshift-marketplace/certified-operators-bbwx5" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.489145 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tldxq\" (UniqueName: 
\"kubernetes.io/projected/a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa-kube-api-access-tldxq\") pod \"certified-operators-bbwx5\" (UID: \"a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa\") " pod="openshift-marketplace/certified-operators-bbwx5" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.537462 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bbwx5" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.569795 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ccc5b44-95ad-4f4c-8086-c176c41bbd19" path="/var/lib/kubelet/pods/1ccc5b44-95ad-4f4c-8086-c176c41bbd19/volumes" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.570877 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6" path="/var/lib/kubelet/pods/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6/volumes" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.572067 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="691ffa6f-3ee6-47fa-bcef-9fdd74ac86df" path="/var/lib/kubelet/pods/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df/volumes" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.574118 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88ba6041-7f8f-48f0-840c-8ea2a9bdc87b" path="/var/lib/kubelet/pods/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b/volumes" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.575251 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be22cbfb-d3e7-43c1-be38-f6fcadeb2c97" path="/var/lib/kubelet/pods/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97/volumes" Mar 09 13:28:15 crc kubenswrapper[4764]: W0309 13:28:15.756198 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6d68e16_c0d2_4f98_9b3f_d1d392bf67fa.slice/crio-52ed88d09225c736fe314110b647e24374da840281277d81c107d4789f623960 WatchSource:0}: Error 
finding container 52ed88d09225c736fe314110b647e24374da840281277d81c107d4789f623960: Status 404 returned error can't find the container with id 52ed88d09225c736fe314110b647e24374da840281277d81c107d4789f623960 Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.756897 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bbwx5"] Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.782420 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-whs64"] Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.785829 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-whs64" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.788458 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.790101 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-whs64"] Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.802787 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbwx5" event={"ID":"a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa","Type":"ContainerStarted","Data":"52ed88d09225c736fe314110b647e24374da840281277d81c107d4789f623960"} Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.972433 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btsbb\" (UniqueName: \"kubernetes.io/projected/26dd13d2-9d2e-4f59-97a6-e31b76ccf74c-kube-api-access-btsbb\") pod \"redhat-marketplace-whs64\" (UID: \"26dd13d2-9d2e-4f59-97a6-e31b76ccf74c\") " pod="openshift-marketplace/redhat-marketplace-whs64" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.972507 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26dd13d2-9d2e-4f59-97a6-e31b76ccf74c-utilities\") pod \"redhat-marketplace-whs64\" (UID: \"26dd13d2-9d2e-4f59-97a6-e31b76ccf74c\") " pod="openshift-marketplace/redhat-marketplace-whs64" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.972535 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26dd13d2-9d2e-4f59-97a6-e31b76ccf74c-catalog-content\") pod \"redhat-marketplace-whs64\" (UID: \"26dd13d2-9d2e-4f59-97a6-e31b76ccf74c\") " pod="openshift-marketplace/redhat-marketplace-whs64" Mar 09 13:28:16 crc kubenswrapper[4764]: I0309 13:28:16.074299 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btsbb\" (UniqueName: \"kubernetes.io/projected/26dd13d2-9d2e-4f59-97a6-e31b76ccf74c-kube-api-access-btsbb\") pod \"redhat-marketplace-whs64\" (UID: \"26dd13d2-9d2e-4f59-97a6-e31b76ccf74c\") " pod="openshift-marketplace/redhat-marketplace-whs64" Mar 09 13:28:16 crc kubenswrapper[4764]: I0309 13:28:16.074362 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26dd13d2-9d2e-4f59-97a6-e31b76ccf74c-utilities\") pod \"redhat-marketplace-whs64\" (UID: \"26dd13d2-9d2e-4f59-97a6-e31b76ccf74c\") " pod="openshift-marketplace/redhat-marketplace-whs64" Mar 09 13:28:16 crc kubenswrapper[4764]: I0309 13:28:16.074390 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26dd13d2-9d2e-4f59-97a6-e31b76ccf74c-catalog-content\") pod \"redhat-marketplace-whs64\" (UID: \"26dd13d2-9d2e-4f59-97a6-e31b76ccf74c\") " pod="openshift-marketplace/redhat-marketplace-whs64" Mar 09 13:28:16 crc kubenswrapper[4764]: I0309 13:28:16.074871 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26dd13d2-9d2e-4f59-97a6-e31b76ccf74c-catalog-content\") pod \"redhat-marketplace-whs64\" (UID: \"26dd13d2-9d2e-4f59-97a6-e31b76ccf74c\") " pod="openshift-marketplace/redhat-marketplace-whs64" Mar 09 13:28:16 crc kubenswrapper[4764]: I0309 13:28:16.075048 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26dd13d2-9d2e-4f59-97a6-e31b76ccf74c-utilities\") pod \"redhat-marketplace-whs64\" (UID: \"26dd13d2-9d2e-4f59-97a6-e31b76ccf74c\") " pod="openshift-marketplace/redhat-marketplace-whs64" Mar 09 13:28:16 crc kubenswrapper[4764]: I0309 13:28:16.092930 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btsbb\" (UniqueName: \"kubernetes.io/projected/26dd13d2-9d2e-4f59-97a6-e31b76ccf74c-kube-api-access-btsbb\") pod \"redhat-marketplace-whs64\" (UID: \"26dd13d2-9d2e-4f59-97a6-e31b76ccf74c\") " pod="openshift-marketplace/redhat-marketplace-whs64" Mar 09 13:28:16 crc kubenswrapper[4764]: I0309 13:28:16.143717 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-whs64" Mar 09 13:28:16 crc kubenswrapper[4764]: I0309 13:28:16.317496 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-whs64"] Mar 09 13:28:16 crc kubenswrapper[4764]: W0309 13:28:16.329046 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26dd13d2_9d2e_4f59_97a6_e31b76ccf74c.slice/crio-6ab160557ac97ec94ac2560399e27ade1bfc2a79da5943ef4efa93130b97aae0 WatchSource:0}: Error finding container 6ab160557ac97ec94ac2560399e27ade1bfc2a79da5943ef4efa93130b97aae0: Status 404 returned error can't find the container with id 6ab160557ac97ec94ac2560399e27ade1bfc2a79da5943ef4efa93130b97aae0 Mar 09 13:28:16 crc kubenswrapper[4764]: I0309 13:28:16.805502 4764 generic.go:334] "Generic (PLEG): container finished" podID="26dd13d2-9d2e-4f59-97a6-e31b76ccf74c" containerID="350b7d2111c2f01954e1ea25694c1ad7063253e51b97f678cae864034348ca9d" exitCode=0 Mar 09 13:28:16 crc kubenswrapper[4764]: I0309 13:28:16.806686 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whs64" event={"ID":"26dd13d2-9d2e-4f59-97a6-e31b76ccf74c","Type":"ContainerDied","Data":"350b7d2111c2f01954e1ea25694c1ad7063253e51b97f678cae864034348ca9d"} Mar 09 13:28:16 crc kubenswrapper[4764]: I0309 13:28:16.806714 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whs64" event={"ID":"26dd13d2-9d2e-4f59-97a6-e31b76ccf74c","Type":"ContainerStarted","Data":"6ab160557ac97ec94ac2560399e27ade1bfc2a79da5943ef4efa93130b97aae0"} Mar 09 13:28:16 crc kubenswrapper[4764]: I0309 13:28:16.809951 4764 generic.go:334] "Generic (PLEG): container finished" podID="a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa" containerID="c086db9ee77f8fe876bbaf1f0cf47f42e173bf32d2853cc8ca88a473223d9396" exitCode=0 Mar 09 13:28:16 crc kubenswrapper[4764]: I0309 
13:28:16.810827 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbwx5" event={"ID":"a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa","Type":"ContainerDied","Data":"c086db9ee77f8fe876bbaf1f0cf47f42e173bf32d2853cc8ca88a473223d9396"} Mar 09 13:28:17 crc kubenswrapper[4764]: I0309 13:28:17.582665 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xn8sz"] Mar 09 13:28:17 crc kubenswrapper[4764]: I0309 13:28:17.585228 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xn8sz" Mar 09 13:28:17 crc kubenswrapper[4764]: I0309 13:28:17.587960 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 09 13:28:17 crc kubenswrapper[4764]: I0309 13:28:17.590692 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xn8sz"] Mar 09 13:28:17 crc kubenswrapper[4764]: I0309 13:28:17.698351 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/621cdc4e-d896-4775-b654-2d6606097cb9-catalog-content\") pod \"redhat-operators-xn8sz\" (UID: \"621cdc4e-d896-4775-b654-2d6606097cb9\") " pod="openshift-marketplace/redhat-operators-xn8sz" Mar 09 13:28:17 crc kubenswrapper[4764]: I0309 13:28:17.698989 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtpvd\" (UniqueName: \"kubernetes.io/projected/621cdc4e-d896-4775-b654-2d6606097cb9-kube-api-access-jtpvd\") pod \"redhat-operators-xn8sz\" (UID: \"621cdc4e-d896-4775-b654-2d6606097cb9\") " pod="openshift-marketplace/redhat-operators-xn8sz" Mar 09 13:28:17 crc kubenswrapper[4764]: I0309 13:28:17.699488 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/621cdc4e-d896-4775-b654-2d6606097cb9-utilities\") pod \"redhat-operators-xn8sz\" (UID: \"621cdc4e-d896-4775-b654-2d6606097cb9\") " pod="openshift-marketplace/redhat-operators-xn8sz" Mar 09 13:28:17 crc kubenswrapper[4764]: I0309 13:28:17.801096 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtpvd\" (UniqueName: \"kubernetes.io/projected/621cdc4e-d896-4775-b654-2d6606097cb9-kube-api-access-jtpvd\") pod \"redhat-operators-xn8sz\" (UID: \"621cdc4e-d896-4775-b654-2d6606097cb9\") " pod="openshift-marketplace/redhat-operators-xn8sz" Mar 09 13:28:17 crc kubenswrapper[4764]: I0309 13:28:17.801165 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/621cdc4e-d896-4775-b654-2d6606097cb9-utilities\") pod \"redhat-operators-xn8sz\" (UID: \"621cdc4e-d896-4775-b654-2d6606097cb9\") " pod="openshift-marketplace/redhat-operators-xn8sz" Mar 09 13:28:17 crc kubenswrapper[4764]: I0309 13:28:17.801206 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/621cdc4e-d896-4775-b654-2d6606097cb9-catalog-content\") pod \"redhat-operators-xn8sz\" (UID: \"621cdc4e-d896-4775-b654-2d6606097cb9\") " pod="openshift-marketplace/redhat-operators-xn8sz" Mar 09 13:28:17 crc kubenswrapper[4764]: I0309 13:28:17.801664 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/621cdc4e-d896-4775-b654-2d6606097cb9-utilities\") pod \"redhat-operators-xn8sz\" (UID: \"621cdc4e-d896-4775-b654-2d6606097cb9\") " pod="openshift-marketplace/redhat-operators-xn8sz" Mar 09 13:28:17 crc kubenswrapper[4764]: I0309 13:28:17.801742 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/621cdc4e-d896-4775-b654-2d6606097cb9-catalog-content\") pod \"redhat-operators-xn8sz\" (UID: \"621cdc4e-d896-4775-b654-2d6606097cb9\") " pod="openshift-marketplace/redhat-operators-xn8sz" Mar 09 13:28:17 crc kubenswrapper[4764]: I0309 13:28:17.831953 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtpvd\" (UniqueName: \"kubernetes.io/projected/621cdc4e-d896-4775-b654-2d6606097cb9-kube-api-access-jtpvd\") pod \"redhat-operators-xn8sz\" (UID: \"621cdc4e-d896-4775-b654-2d6606097cb9\") " pod="openshift-marketplace/redhat-operators-xn8sz" Mar 09 13:28:17 crc kubenswrapper[4764]: I0309 13:28:17.837345 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbwx5" event={"ID":"a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa","Type":"ContainerStarted","Data":"91f21d1192fbf7233d0ba5261d6dd79a24c5e670d2ce3ce56ee836e0d4154e5b"} Mar 09 13:28:17 crc kubenswrapper[4764]: I0309 13:28:17.858134 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whs64" event={"ID":"26dd13d2-9d2e-4f59-97a6-e31b76ccf74c","Type":"ContainerStarted","Data":"b5083cf386271d338b4c15c21672fafdc012420c0f721902a67a9fc096355808"} Mar 09 13:28:17 crc kubenswrapper[4764]: I0309 13:28:17.909562 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xn8sz" Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.072057 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xn8sz"] Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.181220 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4sxc8"] Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.182464 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4sxc8" Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.185667 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.199000 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4sxc8"] Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.307704 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa6ff5f6-9328-419b-a996-05bcf478b446-utilities\") pod \"community-operators-4sxc8\" (UID: \"fa6ff5f6-9328-419b-a996-05bcf478b446\") " pod="openshift-marketplace/community-operators-4sxc8" Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.307767 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa6ff5f6-9328-419b-a996-05bcf478b446-catalog-content\") pod \"community-operators-4sxc8\" (UID: \"fa6ff5f6-9328-419b-a996-05bcf478b446\") " pod="openshift-marketplace/community-operators-4sxc8" Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.307908 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p26kd\" (UniqueName: \"kubernetes.io/projected/fa6ff5f6-9328-419b-a996-05bcf478b446-kube-api-access-p26kd\") pod \"community-operators-4sxc8\" (UID: \"fa6ff5f6-9328-419b-a996-05bcf478b446\") " pod="openshift-marketplace/community-operators-4sxc8" Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.408895 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa6ff5f6-9328-419b-a996-05bcf478b446-catalog-content\") pod \"community-operators-4sxc8\" (UID: 
\"fa6ff5f6-9328-419b-a996-05bcf478b446\") " pod="openshift-marketplace/community-operators-4sxc8" Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.409042 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p26kd\" (UniqueName: \"kubernetes.io/projected/fa6ff5f6-9328-419b-a996-05bcf478b446-kube-api-access-p26kd\") pod \"community-operators-4sxc8\" (UID: \"fa6ff5f6-9328-419b-a996-05bcf478b446\") " pod="openshift-marketplace/community-operators-4sxc8" Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.409112 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa6ff5f6-9328-419b-a996-05bcf478b446-utilities\") pod \"community-operators-4sxc8\" (UID: \"fa6ff5f6-9328-419b-a996-05bcf478b446\") " pod="openshift-marketplace/community-operators-4sxc8" Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.409471 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa6ff5f6-9328-419b-a996-05bcf478b446-catalog-content\") pod \"community-operators-4sxc8\" (UID: \"fa6ff5f6-9328-419b-a996-05bcf478b446\") " pod="openshift-marketplace/community-operators-4sxc8" Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.409490 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa6ff5f6-9328-419b-a996-05bcf478b446-utilities\") pod \"community-operators-4sxc8\" (UID: \"fa6ff5f6-9328-419b-a996-05bcf478b446\") " pod="openshift-marketplace/community-operators-4sxc8" Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.429021 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p26kd\" (UniqueName: \"kubernetes.io/projected/fa6ff5f6-9328-419b-a996-05bcf478b446-kube-api-access-p26kd\") pod \"community-operators-4sxc8\" (UID: 
\"fa6ff5f6-9328-419b-a996-05bcf478b446\") " pod="openshift-marketplace/community-operators-4sxc8" Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.583420 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4sxc8" Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.782081 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4sxc8"] Mar 09 13:28:18 crc kubenswrapper[4764]: W0309 13:28:18.788529 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa6ff5f6_9328_419b_a996_05bcf478b446.slice/crio-fa2faca2b6f361aab53dbf1e7c047f8fa427577595f01228a4ee0b2b2c128cfa WatchSource:0}: Error finding container fa2faca2b6f361aab53dbf1e7c047f8fa427577595f01228a4ee0b2b2c128cfa: Status 404 returned error can't find the container with id fa2faca2b6f361aab53dbf1e7c047f8fa427577595f01228a4ee0b2b2c128cfa Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.865690 4764 generic.go:334] "Generic (PLEG): container finished" podID="621cdc4e-d896-4775-b654-2d6606097cb9" containerID="21b7752d01ed5157b1358caf6432ea4435920cfb27ea11e67bee506887d1aece" exitCode=0 Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.865849 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xn8sz" event={"ID":"621cdc4e-d896-4775-b654-2d6606097cb9","Type":"ContainerDied","Data":"21b7752d01ed5157b1358caf6432ea4435920cfb27ea11e67bee506887d1aece"} Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.866236 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xn8sz" event={"ID":"621cdc4e-d896-4775-b654-2d6606097cb9","Type":"ContainerStarted","Data":"2302eb7ccaa0400069ae2934df34cee77b390ca877fa548b26f505702d7c6bcc"} Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.870529 4764 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbwx5" event={"ID":"a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa","Type":"ContainerDied","Data":"91f21d1192fbf7233d0ba5261d6dd79a24c5e670d2ce3ce56ee836e0d4154e5b"} Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.870466 4764 generic.go:334] "Generic (PLEG): container finished" podID="a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa" containerID="91f21d1192fbf7233d0ba5261d6dd79a24c5e670d2ce3ce56ee836e0d4154e5b" exitCode=0 Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.873739 4764 generic.go:334] "Generic (PLEG): container finished" podID="26dd13d2-9d2e-4f59-97a6-e31b76ccf74c" containerID="b5083cf386271d338b4c15c21672fafdc012420c0f721902a67a9fc096355808" exitCode=0 Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.873804 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whs64" event={"ID":"26dd13d2-9d2e-4f59-97a6-e31b76ccf74c","Type":"ContainerDied","Data":"b5083cf386271d338b4c15c21672fafdc012420c0f721902a67a9fc096355808"} Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.875477 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4sxc8" event={"ID":"fa6ff5f6-9328-419b-a996-05bcf478b446","Type":"ContainerStarted","Data":"fa2faca2b6f361aab53dbf1e7c047f8fa427577595f01228a4ee0b2b2c128cfa"} Mar 09 13:28:19 crc kubenswrapper[4764]: I0309 13:28:19.883790 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whs64" event={"ID":"26dd13d2-9d2e-4f59-97a6-e31b76ccf74c","Type":"ContainerStarted","Data":"f1d420ae1949fed35e711d47be2d28ad84d0eb2096c4440052f1371c212a3b0b"} Mar 09 13:28:19 crc kubenswrapper[4764]: I0309 13:28:19.887596 4764 generic.go:334] "Generic (PLEG): container finished" podID="fa6ff5f6-9328-419b-a996-05bcf478b446" containerID="dab29cc25a32a534e61bf039add40e1f06d5ae10ca8ef07579cb0bf91b124a9d" exitCode=0 Mar 09 13:28:19 crc 
kubenswrapper[4764]: I0309 13:28:19.887630 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4sxc8" event={"ID":"fa6ff5f6-9328-419b-a996-05bcf478b446","Type":"ContainerDied","Data":"dab29cc25a32a534e61bf039add40e1f06d5ae10ca8ef07579cb0bf91b124a9d"} Mar 09 13:28:19 crc kubenswrapper[4764]: I0309 13:28:19.920333 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-whs64" podStartSLOduration=2.347893043 podStartE2EDuration="4.920298903s" podCreationTimestamp="2026-03-09 13:28:15 +0000 UTC" firstStartedPulling="2026-03-09 13:28:16.808125938 +0000 UTC m=+452.058297846" lastFinishedPulling="2026-03-09 13:28:19.380531788 +0000 UTC m=+454.630703706" observedRunningTime="2026-03-09 13:28:19.913557058 +0000 UTC m=+455.163728976" watchObservedRunningTime="2026-03-09 13:28:19.920298903 +0000 UTC m=+455.170470811" Mar 09 13:28:19 crc kubenswrapper[4764]: I0309 13:28:19.923072 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbwx5" event={"ID":"a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa","Type":"ContainerStarted","Data":"3e36f3856d0e3975472304508e0beefe03cae8bac38bfcdfeed65a7529c78eed"} Mar 09 13:28:19 crc kubenswrapper[4764]: I0309 13:28:19.981709 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bbwx5" podStartSLOduration=2.486247766 podStartE2EDuration="4.981692226s" podCreationTimestamp="2026-03-09 13:28:15 +0000 UTC" firstStartedPulling="2026-03-09 13:28:16.811352317 +0000 UTC m=+452.061524215" lastFinishedPulling="2026-03-09 13:28:19.306796777 +0000 UTC m=+454.556968675" observedRunningTime="2026-03-09 13:28:19.979400183 +0000 UTC m=+455.229572101" watchObservedRunningTime="2026-03-09 13:28:19.981692226 +0000 UTC m=+455.231864134" Mar 09 13:28:20 crc kubenswrapper[4764]: I0309 13:28:20.931259 4764 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/community-operators-4sxc8" event={"ID":"fa6ff5f6-9328-419b-a996-05bcf478b446","Type":"ContainerStarted","Data":"81f2426c617949ba2e886b6a6713e69e1e2306d5a032ede7615f1cde56b297b4"} Mar 09 13:28:20 crc kubenswrapper[4764]: I0309 13:28:20.933046 4764 generic.go:334] "Generic (PLEG): container finished" podID="621cdc4e-d896-4775-b654-2d6606097cb9" containerID="fb1be1a72f3234dac4f59f874d4e626091a8abe81a68404c7028805d3e2518ea" exitCode=0 Mar 09 13:28:20 crc kubenswrapper[4764]: I0309 13:28:20.933201 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xn8sz" event={"ID":"621cdc4e-d896-4775-b654-2d6606097cb9","Type":"ContainerDied","Data":"fb1be1a72f3234dac4f59f874d4e626091a8abe81a68404c7028805d3e2518ea"} Mar 09 13:28:21 crc kubenswrapper[4764]: I0309 13:28:21.943917 4764 generic.go:334] "Generic (PLEG): container finished" podID="fa6ff5f6-9328-419b-a996-05bcf478b446" containerID="81f2426c617949ba2e886b6a6713e69e1e2306d5a032ede7615f1cde56b297b4" exitCode=0 Mar 09 13:28:21 crc kubenswrapper[4764]: I0309 13:28:21.943993 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4sxc8" event={"ID":"fa6ff5f6-9328-419b-a996-05bcf478b446","Type":"ContainerDied","Data":"81f2426c617949ba2e886b6a6713e69e1e2306d5a032ede7615f1cde56b297b4"} Mar 09 13:28:21 crc kubenswrapper[4764]: I0309 13:28:21.949046 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xn8sz" event={"ID":"621cdc4e-d896-4775-b654-2d6606097cb9","Type":"ContainerStarted","Data":"678fe7f391aa999be0810d250bf35094d5e83712c163ecc09372d0f1c2a64457"} Mar 09 13:28:21 crc kubenswrapper[4764]: I0309 13:28:21.999716 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xn8sz" podStartSLOduration=2.538070135 podStartE2EDuration="4.999633884s" podCreationTimestamp="2026-03-09 13:28:17 +0000 UTC" 
firstStartedPulling="2026-03-09 13:28:18.868705289 +0000 UTC m=+454.118877197" lastFinishedPulling="2026-03-09 13:28:21.330269038 +0000 UTC m=+456.580440946" observedRunningTime="2026-03-09 13:28:21.997091207 +0000 UTC m=+457.247263125" watchObservedRunningTime="2026-03-09 13:28:21.999633884 +0000 UTC m=+457.249805802" Mar 09 13:28:22 crc kubenswrapper[4764]: I0309 13:28:22.961595 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4sxc8" event={"ID":"fa6ff5f6-9328-419b-a996-05bcf478b446","Type":"ContainerStarted","Data":"cbaaee31f94d5b978452176039ab59d2520f7f3d0d136e8695a1528c41a1e96e"} Mar 09 13:28:22 crc kubenswrapper[4764]: I0309 13:28:22.984958 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4sxc8" podStartSLOduration=2.5611369440000002 podStartE2EDuration="4.984937949s" podCreationTimestamp="2026-03-09 13:28:18 +0000 UTC" firstStartedPulling="2026-03-09 13:28:19.889443397 +0000 UTC m=+455.139615305" lastFinishedPulling="2026-03-09 13:28:22.313244352 +0000 UTC m=+457.563416310" observedRunningTime="2026-03-09 13:28:22.984748374 +0000 UTC m=+458.234920292" watchObservedRunningTime="2026-03-09 13:28:22.984937949 +0000 UTC m=+458.235109857" Mar 09 13:28:25 crc kubenswrapper[4764]: I0309 13:28:25.538606 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bbwx5" Mar 09 13:28:25 crc kubenswrapper[4764]: I0309 13:28:25.539117 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bbwx5" Mar 09 13:28:25 crc kubenswrapper[4764]: I0309 13:28:25.600540 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bbwx5" Mar 09 13:28:26 crc kubenswrapper[4764]: I0309 13:28:26.025274 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-bbwx5" Mar 09 13:28:26 crc kubenswrapper[4764]: I0309 13:28:26.144236 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-whs64" Mar 09 13:28:26 crc kubenswrapper[4764]: I0309 13:28:26.144680 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-whs64" Mar 09 13:28:26 crc kubenswrapper[4764]: I0309 13:28:26.181124 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-whs64" Mar 09 13:28:27 crc kubenswrapper[4764]: I0309 13:28:27.036397 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-whs64" Mar 09 13:28:27 crc kubenswrapper[4764]: I0309 13:28:27.668738 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:28:27 crc kubenswrapper[4764]: I0309 13:28:27.669186 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:28:27 crc kubenswrapper[4764]: I0309 13:28:27.670316 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:28:27 crc kubenswrapper[4764]: I0309 13:28:27.678025 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:28:27 crc kubenswrapper[4764]: I0309 13:28:27.760332 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:28:27 crc kubenswrapper[4764]: I0309 13:28:27.910166 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xn8sz" Mar 09 13:28:27 crc kubenswrapper[4764]: I0309 13:28:27.910257 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xn8sz" Mar 09 13:28:27 crc kubenswrapper[4764]: I0309 13:28:27.957581 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xn8sz" Mar 09 13:28:28 crc kubenswrapper[4764]: W0309 13:28:28.004554 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-c587b0b9f778265ef0d6ecfa8659f9de6da5b53a8719e98811d74683bc3fdf0c WatchSource:0}: Error finding container c587b0b9f778265ef0d6ecfa8659f9de6da5b53a8719e98811d74683bc3fdf0c: Status 404 returned error can't find the container with id c587b0b9f778265ef0d6ecfa8659f9de6da5b53a8719e98811d74683bc3fdf0c Mar 09 13:28:28 crc kubenswrapper[4764]: I0309 13:28:28.041896 4764 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xn8sz" Mar 09 13:28:28 crc kubenswrapper[4764]: I0309 13:28:28.370368 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:28:28 crc kubenswrapper[4764]: I0309 13:28:28.370468 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:28:28 crc kubenswrapper[4764]: I0309 13:28:28.583523 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4sxc8" Mar 09 13:28:28 crc kubenswrapper[4764]: I0309 13:28:28.583572 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4sxc8" Mar 09 13:28:28 crc kubenswrapper[4764]: I0309 13:28:28.629476 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4sxc8" Mar 09 13:28:28 crc kubenswrapper[4764]: I0309 13:28:28.684091 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:28:28 crc kubenswrapper[4764]: I0309 13:28:28.684192 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" 
(UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:28:28 crc kubenswrapper[4764]: I0309 13:28:28.691120 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:28:28 crc kubenswrapper[4764]: I0309 13:28:28.691134 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:28:28 crc kubenswrapper[4764]: I0309 13:28:28.860858 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:28:28 crc kubenswrapper[4764]: I0309 13:28:28.960284 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:28:29 crc kubenswrapper[4764]: I0309 13:28:29.015101 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a84715ddddcc93a14a8cb4cf3872420ec468facb9923c9290e93bd3c37b10b31"} Mar 09 13:28:29 crc kubenswrapper[4764]: I0309 13:28:29.015173 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c587b0b9f778265ef0d6ecfa8659f9de6da5b53a8719e98811d74683bc3fdf0c"} Mar 09 13:28:29 crc kubenswrapper[4764]: I0309 13:28:29.095552 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4sxc8" Mar 09 13:28:29 crc kubenswrapper[4764]: W0309 13:28:29.330466 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-6c8947a1f915848c48c6c0419f75f88ef3a66ca240e783f2db1dc846f908de65 WatchSource:0}: Error finding container 6c8947a1f915848c48c6c0419f75f88ef3a66ca240e783f2db1dc846f908de65: Status 404 returned error can't find the container with id 6c8947a1f915848c48c6c0419f75f88ef3a66ca240e783f2db1dc846f908de65 Mar 09 13:28:30 crc kubenswrapper[4764]: I0309 13:28:30.022276 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"0fd4770d347eabdfc31bf15a33ee5028197544bb396e1ae5fb89adaef69c34d5"} Mar 09 13:28:30 crc kubenswrapper[4764]: I0309 13:28:30.022766 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"6c8947a1f915848c48c6c0419f75f88ef3a66ca240e783f2db1dc846f908de65"} Mar 09 13:28:30 crc kubenswrapper[4764]: I0309 13:28:30.025690 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ec57b55e16d15942e8a8d592497f69bea44f72b5da852d3582dba57da9a2033c"} Mar 09 13:28:30 crc kubenswrapper[4764]: I0309 13:28:30.025721 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"096db084c373c866c7fb651144d795fa04cc36153c4327aba08cace8e3c46039"} Mar 09 13:28:30 crc kubenswrapper[4764]: I0309 13:28:30.026057 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:28:58 crc kubenswrapper[4764]: I0309 13:28:58.370805 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:28:58 crc kubenswrapper[4764]: I0309 13:28:58.371455 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:28:58 crc kubenswrapper[4764]: I0309 13:28:58.371525 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:28:58 crc kubenswrapper[4764]: I0309 13:28:58.372834 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8bb39ae24112881ead01c36905f25d69cb36d1e4d3c6e8aa79f283e7dd0d444b"} pod="openshift-machine-config-operator/machine-config-daemon-xxczl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 13:28:58 crc kubenswrapper[4764]: I0309 13:28:58.372919 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" containerID="cri-o://8bb39ae24112881ead01c36905f25d69cb36d1e4d3c6e8aa79f283e7dd0d444b" gracePeriod=600 Mar 09 13:28:59 crc kubenswrapper[4764]: I0309 13:28:59.212370 4764 generic.go:334] "Generic (PLEG): container finished" podID="6bcdd179-43c2-427c-9fac-7155c122e922" containerID="8bb39ae24112881ead01c36905f25d69cb36d1e4d3c6e8aa79f283e7dd0d444b" exitCode=0 Mar 09 13:28:59 crc kubenswrapper[4764]: I0309 13:28:59.212439 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerDied","Data":"8bb39ae24112881ead01c36905f25d69cb36d1e4d3c6e8aa79f283e7dd0d444b"} Mar 09 13:28:59 crc kubenswrapper[4764]: I0309 13:28:59.213320 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"6be6253041773ccdd98bbdfbbb5e4447fbae62fa99ca2320328889b4671f6c14"} Mar 09 13:28:59 crc kubenswrapper[4764]: I0309 13:28:59.213351 4764 scope.go:117] "RemoveContainer" 
containerID="a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad" Mar 09 13:29:08 crc kubenswrapper[4764]: I0309 13:29:08.870566 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.142586 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551050-zmlhn"] Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.143895 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551050-zmlhn" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.147597 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.147737 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp"] Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.148001 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.148808 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.149310 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551050-zmlhn"] Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.149604 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.150676 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.151047 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.153379 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp"] Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.272350 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtshq\" (UniqueName: \"kubernetes.io/projected/7f815cd5-462f-4994-bab1-beef4157b06e-kube-api-access-wtshq\") pod \"auto-csr-approver-29551050-zmlhn\" (UID: \"7f815cd5-462f-4994-bab1-beef4157b06e\") " pod="openshift-infra/auto-csr-approver-29551050-zmlhn" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.272446 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a7784d6-384a-426a-8c7f-17738461c327-config-volume\") pod \"collect-profiles-29551050-9wvsp\" (UID: \"1a7784d6-384a-426a-8c7f-17738461c327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.272477 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a7784d6-384a-426a-8c7f-17738461c327-secret-volume\") pod \"collect-profiles-29551050-9wvsp\" (UID: \"1a7784d6-384a-426a-8c7f-17738461c327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.272499 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwwtz\" (UniqueName: \"kubernetes.io/projected/1a7784d6-384a-426a-8c7f-17738461c327-kube-api-access-kwwtz\") pod \"collect-profiles-29551050-9wvsp\" (UID: \"1a7784d6-384a-426a-8c7f-17738461c327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.373588 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a7784d6-384a-426a-8c7f-17738461c327-config-volume\") pod \"collect-profiles-29551050-9wvsp\" (UID: \"1a7784d6-384a-426a-8c7f-17738461c327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.373678 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a7784d6-384a-426a-8c7f-17738461c327-secret-volume\") pod \"collect-profiles-29551050-9wvsp\" (UID: \"1a7784d6-384a-426a-8c7f-17738461c327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.373712 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwwtz\" (UniqueName: \"kubernetes.io/projected/1a7784d6-384a-426a-8c7f-17738461c327-kube-api-access-kwwtz\") pod \"collect-profiles-29551050-9wvsp\" (UID: 
\"1a7784d6-384a-426a-8c7f-17738461c327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.373739 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtshq\" (UniqueName: \"kubernetes.io/projected/7f815cd5-462f-4994-bab1-beef4157b06e-kube-api-access-wtshq\") pod \"auto-csr-approver-29551050-zmlhn\" (UID: \"7f815cd5-462f-4994-bab1-beef4157b06e\") " pod="openshift-infra/auto-csr-approver-29551050-zmlhn" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.374826 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a7784d6-384a-426a-8c7f-17738461c327-config-volume\") pod \"collect-profiles-29551050-9wvsp\" (UID: \"1a7784d6-384a-426a-8c7f-17738461c327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.382960 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a7784d6-384a-426a-8c7f-17738461c327-secret-volume\") pod \"collect-profiles-29551050-9wvsp\" (UID: \"1a7784d6-384a-426a-8c7f-17738461c327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.392432 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtshq\" (UniqueName: \"kubernetes.io/projected/7f815cd5-462f-4994-bab1-beef4157b06e-kube-api-access-wtshq\") pod \"auto-csr-approver-29551050-zmlhn\" (UID: \"7f815cd5-462f-4994-bab1-beef4157b06e\") " pod="openshift-infra/auto-csr-approver-29551050-zmlhn" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.392739 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwwtz\" (UniqueName: 
\"kubernetes.io/projected/1a7784d6-384a-426a-8c7f-17738461c327-kube-api-access-kwwtz\") pod \"collect-profiles-29551050-9wvsp\" (UID: \"1a7784d6-384a-426a-8c7f-17738461c327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.477862 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551050-zmlhn" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.496328 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.773575 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551050-zmlhn"] Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.788124 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.933212 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp"] Mar 09 13:30:00 crc kubenswrapper[4764]: W0309 13:30:00.939442 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a7784d6_384a_426a_8c7f_17738461c327.slice/crio-bd22d72b8374d245cd7927a10057d1dc10b1552a1c9306e29d1f5b840279060d WatchSource:0}: Error finding container bd22d72b8374d245cd7927a10057d1dc10b1552a1c9306e29d1f5b840279060d: Status 404 returned error can't find the container with id bd22d72b8374d245cd7927a10057d1dc10b1552a1c9306e29d1f5b840279060d Mar 09 13:30:01 crc kubenswrapper[4764]: I0309 13:30:01.743512 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551050-zmlhn" 
event={"ID":"7f815cd5-462f-4994-bab1-beef4157b06e","Type":"ContainerStarted","Data":"fee757faf62e596d9a24630a4cb2fe2b56b3f58f6cb88a77fee32c0a8f6e491c"} Mar 09 13:30:01 crc kubenswrapper[4764]: I0309 13:30:01.744359 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp" event={"ID":"1a7784d6-384a-426a-8c7f-17738461c327","Type":"ContainerStarted","Data":"bd22d72b8374d245cd7927a10057d1dc10b1552a1c9306e29d1f5b840279060d"} Mar 09 13:30:02 crc kubenswrapper[4764]: I0309 13:30:02.752018 4764 generic.go:334] "Generic (PLEG): container finished" podID="7f815cd5-462f-4994-bab1-beef4157b06e" containerID="a7ac41644a3901488ef405782554d6dd08becac720d51b607ff6a4cba78e912f" exitCode=0 Mar 09 13:30:02 crc kubenswrapper[4764]: I0309 13:30:02.752094 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551050-zmlhn" event={"ID":"7f815cd5-462f-4994-bab1-beef4157b06e","Type":"ContainerDied","Data":"a7ac41644a3901488ef405782554d6dd08becac720d51b607ff6a4cba78e912f"} Mar 09 13:30:02 crc kubenswrapper[4764]: I0309 13:30:02.754082 4764 generic.go:334] "Generic (PLEG): container finished" podID="1a7784d6-384a-426a-8c7f-17738461c327" containerID="aa846508ebc812b1aaea2ee2e48b6017bd527c31a06deb61dd1001786fb1b811" exitCode=0 Mar 09 13:30:02 crc kubenswrapper[4764]: I0309 13:30:02.754128 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp" event={"ID":"1a7784d6-384a-426a-8c7f-17738461c327","Type":"ContainerDied","Data":"aa846508ebc812b1aaea2ee2e48b6017bd527c31a06deb61dd1001786fb1b811"} Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.083160 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp" Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.088226 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551050-zmlhn" Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.226565 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a7784d6-384a-426a-8c7f-17738461c327-config-volume\") pod \"1a7784d6-384a-426a-8c7f-17738461c327\" (UID: \"1a7784d6-384a-426a-8c7f-17738461c327\") " Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.226674 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtshq\" (UniqueName: \"kubernetes.io/projected/7f815cd5-462f-4994-bab1-beef4157b06e-kube-api-access-wtshq\") pod \"7f815cd5-462f-4994-bab1-beef4157b06e\" (UID: \"7f815cd5-462f-4994-bab1-beef4157b06e\") " Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.226740 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwwtz\" (UniqueName: \"kubernetes.io/projected/1a7784d6-384a-426a-8c7f-17738461c327-kube-api-access-kwwtz\") pod \"1a7784d6-384a-426a-8c7f-17738461c327\" (UID: \"1a7784d6-384a-426a-8c7f-17738461c327\") " Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.226902 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a7784d6-384a-426a-8c7f-17738461c327-secret-volume\") pod \"1a7784d6-384a-426a-8c7f-17738461c327\" (UID: \"1a7784d6-384a-426a-8c7f-17738461c327\") " Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.227612 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a7784d6-384a-426a-8c7f-17738461c327-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"1a7784d6-384a-426a-8c7f-17738461c327" (UID: "1a7784d6-384a-426a-8c7f-17738461c327"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.233528 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a7784d6-384a-426a-8c7f-17738461c327-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1a7784d6-384a-426a-8c7f-17738461c327" (UID: "1a7784d6-384a-426a-8c7f-17738461c327"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.233967 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a7784d6-384a-426a-8c7f-17738461c327-kube-api-access-kwwtz" (OuterVolumeSpecName: "kube-api-access-kwwtz") pod "1a7784d6-384a-426a-8c7f-17738461c327" (UID: "1a7784d6-384a-426a-8c7f-17738461c327"). InnerVolumeSpecName "kube-api-access-kwwtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.240858 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f815cd5-462f-4994-bab1-beef4157b06e-kube-api-access-wtshq" (OuterVolumeSpecName: "kube-api-access-wtshq") pod "7f815cd5-462f-4994-bab1-beef4157b06e" (UID: "7f815cd5-462f-4994-bab1-beef4157b06e"). InnerVolumeSpecName "kube-api-access-wtshq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.327965 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a7784d6-384a-426a-8c7f-17738461c327-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.328004 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtshq\" (UniqueName: \"kubernetes.io/projected/7f815cd5-462f-4994-bab1-beef4157b06e-kube-api-access-wtshq\") on node \"crc\" DevicePath \"\"" Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.328015 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwwtz\" (UniqueName: \"kubernetes.io/projected/1a7784d6-384a-426a-8c7f-17738461c327-kube-api-access-kwwtz\") on node \"crc\" DevicePath \"\"" Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.328024 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a7784d6-384a-426a-8c7f-17738461c327-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.771187 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551050-zmlhn" event={"ID":"7f815cd5-462f-4994-bab1-beef4157b06e","Type":"ContainerDied","Data":"fee757faf62e596d9a24630a4cb2fe2b56b3f58f6cb88a77fee32c0a8f6e491c"} Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.771237 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fee757faf62e596d9a24630a4cb2fe2b56b3f58f6cb88a77fee32c0a8f6e491c" Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.771347 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551050-zmlhn" Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.773546 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp" event={"ID":"1a7784d6-384a-426a-8c7f-17738461c327","Type":"ContainerDied","Data":"bd22d72b8374d245cd7927a10057d1dc10b1552a1c9306e29d1f5b840279060d"} Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.773611 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp" Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.773640 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd22d72b8374d245cd7927a10057d1dc10b1552a1c9306e29d1f5b840279060d" Mar 09 13:30:05 crc kubenswrapper[4764]: I0309 13:30:05.177769 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551044-p748f"] Mar 09 13:30:05 crc kubenswrapper[4764]: I0309 13:30:05.183325 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551044-p748f"] Mar 09 13:30:05 crc kubenswrapper[4764]: I0309 13:30:05.572349 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a005f65-920a-4cdd-b4da-a270953113aa" path="/var/lib/kubelet/pods/0a005f65-920a-4cdd-b4da-a270953113aa/volumes" Mar 09 13:30:58 crc kubenswrapper[4764]: I0309 13:30:58.370008 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:30:58 crc kubenswrapper[4764]: I0309 13:30:58.370419 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" 
podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:31:28 crc kubenswrapper[4764]: I0309 13:31:28.370636 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:31:28 crc kubenswrapper[4764]: I0309 13:31:28.371756 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:31:49 crc kubenswrapper[4764]: I0309 13:31:49.333395 4764 scope.go:117] "RemoveContainer" containerID="3ce66a9ae238a55280ff899673763223d028dea40122d545386701984ba576ae" Mar 09 13:31:58 crc kubenswrapper[4764]: I0309 13:31:58.370572 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:31:58 crc kubenswrapper[4764]: I0309 13:31:58.371638 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:31:58 crc kubenswrapper[4764]: I0309 13:31:58.371767 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:31:58 crc kubenswrapper[4764]: I0309 13:31:58.372758 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6be6253041773ccdd98bbdfbbb5e4447fbae62fa99ca2320328889b4671f6c14"} pod="openshift-machine-config-operator/machine-config-daemon-xxczl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 13:31:58 crc kubenswrapper[4764]: I0309 13:31:58.372891 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" containerID="cri-o://6be6253041773ccdd98bbdfbbb5e4447fbae62fa99ca2320328889b4671f6c14" gracePeriod=600 Mar 09 13:31:58 crc kubenswrapper[4764]: I0309 13:31:58.585954 4764 generic.go:334] "Generic (PLEG): container finished" podID="6bcdd179-43c2-427c-9fac-7155c122e922" containerID="6be6253041773ccdd98bbdfbbb5e4447fbae62fa99ca2320328889b4671f6c14" exitCode=0 Mar 09 13:31:58 crc kubenswrapper[4764]: I0309 13:31:58.586022 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerDied","Data":"6be6253041773ccdd98bbdfbbb5e4447fbae62fa99ca2320328889b4671f6c14"} Mar 09 13:31:58 crc kubenswrapper[4764]: I0309 13:31:58.586156 4764 scope.go:117] "RemoveContainer" containerID="8bb39ae24112881ead01c36905f25d69cb36d1e4d3c6e8aa79f283e7dd0d444b" Mar 09 13:31:59 crc kubenswrapper[4764]: I0309 13:31:59.596831 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" 
event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"3e004d46e664a823c675c177e537cdd2b21dcfc6ff9afcb1b390da1939340817"} Mar 09 13:32:00 crc kubenswrapper[4764]: I0309 13:32:00.147724 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551052-t652n"] Mar 09 13:32:00 crc kubenswrapper[4764]: E0309 13:32:00.148028 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a7784d6-384a-426a-8c7f-17738461c327" containerName="collect-profiles" Mar 09 13:32:00 crc kubenswrapper[4764]: I0309 13:32:00.148047 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a7784d6-384a-426a-8c7f-17738461c327" containerName="collect-profiles" Mar 09 13:32:00 crc kubenswrapper[4764]: E0309 13:32:00.148067 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f815cd5-462f-4994-bab1-beef4157b06e" containerName="oc" Mar 09 13:32:00 crc kubenswrapper[4764]: I0309 13:32:00.148075 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f815cd5-462f-4994-bab1-beef4157b06e" containerName="oc" Mar 09 13:32:00 crc kubenswrapper[4764]: I0309 13:32:00.148219 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a7784d6-384a-426a-8c7f-17738461c327" containerName="collect-profiles" Mar 09 13:32:00 crc kubenswrapper[4764]: I0309 13:32:00.148235 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f815cd5-462f-4994-bab1-beef4157b06e" containerName="oc" Mar 09 13:32:00 crc kubenswrapper[4764]: I0309 13:32:00.148772 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551052-t652n" Mar 09 13:32:00 crc kubenswrapper[4764]: I0309 13:32:00.151934 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:32:00 crc kubenswrapper[4764]: I0309 13:32:00.151989 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:32:00 crc kubenswrapper[4764]: I0309 13:32:00.152086 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 13:32:00 crc kubenswrapper[4764]: I0309 13:32:00.186898 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551052-t652n"] Mar 09 13:32:00 crc kubenswrapper[4764]: I0309 13:32:00.194229 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b66fj\" (UniqueName: \"kubernetes.io/projected/ee50d407-01a6-43e7-833e-b803dbb4792f-kube-api-access-b66fj\") pod \"auto-csr-approver-29551052-t652n\" (UID: \"ee50d407-01a6-43e7-833e-b803dbb4792f\") " pod="openshift-infra/auto-csr-approver-29551052-t652n" Mar 09 13:32:00 crc kubenswrapper[4764]: I0309 13:32:00.295366 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b66fj\" (UniqueName: \"kubernetes.io/projected/ee50d407-01a6-43e7-833e-b803dbb4792f-kube-api-access-b66fj\") pod \"auto-csr-approver-29551052-t652n\" (UID: \"ee50d407-01a6-43e7-833e-b803dbb4792f\") " pod="openshift-infra/auto-csr-approver-29551052-t652n" Mar 09 13:32:00 crc kubenswrapper[4764]: I0309 13:32:00.317070 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b66fj\" (UniqueName: \"kubernetes.io/projected/ee50d407-01a6-43e7-833e-b803dbb4792f-kube-api-access-b66fj\") pod \"auto-csr-approver-29551052-t652n\" (UID: \"ee50d407-01a6-43e7-833e-b803dbb4792f\") " 
pod="openshift-infra/auto-csr-approver-29551052-t652n" Mar 09 13:32:00 crc kubenswrapper[4764]: I0309 13:32:00.490017 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551052-t652n" Mar 09 13:32:00 crc kubenswrapper[4764]: I0309 13:32:00.738677 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551052-t652n"] Mar 09 13:32:01 crc kubenswrapper[4764]: I0309 13:32:01.621488 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551052-t652n" event={"ID":"ee50d407-01a6-43e7-833e-b803dbb4792f","Type":"ContainerStarted","Data":"11bea6b1c69c2fe5f00de917bcf3bd6c5542470358427980ac9d3ff037bf4fb1"} Mar 09 13:32:02 crc kubenswrapper[4764]: I0309 13:32:02.633754 4764 generic.go:334] "Generic (PLEG): container finished" podID="ee50d407-01a6-43e7-833e-b803dbb4792f" containerID="492bc9f6bc85937ff3b8aee6d3f29e3e150f1bc426852bf9cdf6e5878286e321" exitCode=0 Mar 09 13:32:02 crc kubenswrapper[4764]: I0309 13:32:02.633973 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551052-t652n" event={"ID":"ee50d407-01a6-43e7-833e-b803dbb4792f","Type":"ContainerDied","Data":"492bc9f6bc85937ff3b8aee6d3f29e3e150f1bc426852bf9cdf6e5878286e321"} Mar 09 13:32:03 crc kubenswrapper[4764]: I0309 13:32:03.875115 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551052-t652n" Mar 09 13:32:03 crc kubenswrapper[4764]: I0309 13:32:03.946633 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b66fj\" (UniqueName: \"kubernetes.io/projected/ee50d407-01a6-43e7-833e-b803dbb4792f-kube-api-access-b66fj\") pod \"ee50d407-01a6-43e7-833e-b803dbb4792f\" (UID: \"ee50d407-01a6-43e7-833e-b803dbb4792f\") " Mar 09 13:32:03 crc kubenswrapper[4764]: I0309 13:32:03.954162 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee50d407-01a6-43e7-833e-b803dbb4792f-kube-api-access-b66fj" (OuterVolumeSpecName: "kube-api-access-b66fj") pod "ee50d407-01a6-43e7-833e-b803dbb4792f" (UID: "ee50d407-01a6-43e7-833e-b803dbb4792f"). InnerVolumeSpecName "kube-api-access-b66fj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:32:04 crc kubenswrapper[4764]: I0309 13:32:04.047600 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b66fj\" (UniqueName: \"kubernetes.io/projected/ee50d407-01a6-43e7-833e-b803dbb4792f-kube-api-access-b66fj\") on node \"crc\" DevicePath \"\"" Mar 09 13:32:04 crc kubenswrapper[4764]: I0309 13:32:04.658501 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551052-t652n" event={"ID":"ee50d407-01a6-43e7-833e-b803dbb4792f","Type":"ContainerDied","Data":"11bea6b1c69c2fe5f00de917bcf3bd6c5542470358427980ac9d3ff037bf4fb1"} Mar 09 13:32:04 crc kubenswrapper[4764]: I0309 13:32:04.658869 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11bea6b1c69c2fe5f00de917bcf3bd6c5542470358427980ac9d3ff037bf4fb1" Mar 09 13:32:04 crc kubenswrapper[4764]: I0309 13:32:04.659273 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551052-t652n" Mar 09 13:32:04 crc kubenswrapper[4764]: I0309 13:32:04.954209 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551046-ll87d"] Mar 09 13:32:04 crc kubenswrapper[4764]: I0309 13:32:04.958951 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551046-ll87d"] Mar 09 13:32:05 crc kubenswrapper[4764]: I0309 13:32:05.571135 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c09230f9-b117-44a0-b3ed-ab6dc7ce0285" path="/var/lib/kubelet/pods/c09230f9-b117-44a0-b3ed-ab6dc7ce0285/volumes" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.163236 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mfcng"] Mar 09 13:32:12 crc kubenswrapper[4764]: E0309 13:32:12.164467 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee50d407-01a6-43e7-833e-b803dbb4792f" containerName="oc" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.164484 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee50d407-01a6-43e7-833e-b803dbb4792f" containerName="oc" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.164615 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee50d407-01a6-43e7-833e-b803dbb4792f" containerName="oc" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.165212 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.186850 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mfcng"] Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.272547 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d889004a-dd34-46e9-ad61-d5bfb627ca16-registry-tls\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.272620 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.272706 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d889004a-dd34-46e9-ad61-d5bfb627ca16-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.272738 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pms6q\" (UniqueName: \"kubernetes.io/projected/d889004a-dd34-46e9-ad61-d5bfb627ca16-kube-api-access-pms6q\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.272766 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d889004a-dd34-46e9-ad61-d5bfb627ca16-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.272790 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d889004a-dd34-46e9-ad61-d5bfb627ca16-trusted-ca\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.272824 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d889004a-dd34-46e9-ad61-d5bfb627ca16-registry-certificates\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.272930 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d889004a-dd34-46e9-ad61-d5bfb627ca16-bound-sa-token\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.298763 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.374114 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d889004a-dd34-46e9-ad61-d5bfb627ca16-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.374180 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d889004a-dd34-46e9-ad61-d5bfb627ca16-trusted-ca\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.374222 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d889004a-dd34-46e9-ad61-d5bfb627ca16-registry-certificates\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.374250 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d889004a-dd34-46e9-ad61-d5bfb627ca16-bound-sa-token\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.374293 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d889004a-dd34-46e9-ad61-d5bfb627ca16-registry-tls\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.374343 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d889004a-dd34-46e9-ad61-d5bfb627ca16-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.374379 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pms6q\" (UniqueName: \"kubernetes.io/projected/d889004a-dd34-46e9-ad61-d5bfb627ca16-kube-api-access-pms6q\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.375736 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d889004a-dd34-46e9-ad61-d5bfb627ca16-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.376243 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d889004a-dd34-46e9-ad61-d5bfb627ca16-registry-certificates\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.376460 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d889004a-dd34-46e9-ad61-d5bfb627ca16-trusted-ca\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.380980 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d889004a-dd34-46e9-ad61-d5bfb627ca16-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.384741 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d889004a-dd34-46e9-ad61-d5bfb627ca16-registry-tls\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.391986 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d889004a-dd34-46e9-ad61-d5bfb627ca16-bound-sa-token\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.392279 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pms6q\" (UniqueName: \"kubernetes.io/projected/d889004a-dd34-46e9-ad61-d5bfb627ca16-kube-api-access-pms6q\") pod \"image-registry-66df7c8f76-mfcng\" (UID: 
\"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.482663 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.702398 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mfcng"] Mar 09 13:32:12 crc kubenswrapper[4764]: W0309 13:32:12.707287 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd889004a_dd34_46e9_ad61_d5bfb627ca16.slice/crio-7985ecb6085e5782c562962cad926d710f5b5e1293068580f4a4447c44f61bce WatchSource:0}: Error finding container 7985ecb6085e5782c562962cad926d710f5b5e1293068580f4a4447c44f61bce: Status 404 returned error can't find the container with id 7985ecb6085e5782c562962cad926d710f5b5e1293068580f4a4447c44f61bce Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.718185 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" event={"ID":"d889004a-dd34-46e9-ad61-d5bfb627ca16","Type":"ContainerStarted","Data":"7985ecb6085e5782c562962cad926d710f5b5e1293068580f4a4447c44f61bce"} Mar 09 13:32:13 crc kubenswrapper[4764]: I0309 13:32:13.727862 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" event={"ID":"d889004a-dd34-46e9-ad61-d5bfb627ca16","Type":"ContainerStarted","Data":"724b8184d1dc96d6b624e07c336ea0f919ff1107762c45217d5416879d550b7d"} Mar 09 13:32:13 crc kubenswrapper[4764]: I0309 13:32:13.728446 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:13 crc kubenswrapper[4764]: I0309 13:32:13.759500 4764 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" podStartSLOduration=1.759465625 podStartE2EDuration="1.759465625s" podCreationTimestamp="2026-03-09 13:32:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:32:13.755816597 +0000 UTC m=+689.005988585" watchObservedRunningTime="2026-03-09 13:32:13.759465625 +0000 UTC m=+689.009637583" Mar 09 13:32:32 crc kubenswrapper[4764]: I0309 13:32:32.487868 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:32 crc kubenswrapper[4764]: I0309 13:32:32.562898 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-w49hn"] Mar 09 13:32:49 crc kubenswrapper[4764]: I0309 13:32:49.409103 4764 scope.go:117] "RemoveContainer" containerID="bde3db10cbf6b95804c8c69e0b70c34f476ab92c998e4a7ae6079e0e0e9d7b98" Mar 09 13:32:57 crc kubenswrapper[4764]: I0309 13:32:57.627671 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" podUID="d3652fe0-4889-432f-af3f-787dd19c60d6" containerName="registry" containerID="cri-o://d4f1dae0a1e050bab0d3f5cbb5a68f2b2bcb5dd4773d77210e4e118d9cdca6b4" gracePeriod=30 Mar 09 13:32:57 crc kubenswrapper[4764]: I0309 13:32:57.981714 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.049416 4764 generic.go:334] "Generic (PLEG): container finished" podID="d3652fe0-4889-432f-af3f-787dd19c60d6" containerID="d4f1dae0a1e050bab0d3f5cbb5a68f2b2bcb5dd4773d77210e4e118d9cdca6b4" exitCode=0 Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.049452 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" event={"ID":"d3652fe0-4889-432f-af3f-787dd19c60d6","Type":"ContainerDied","Data":"d4f1dae0a1e050bab0d3f5cbb5a68f2b2bcb5dd4773d77210e4e118d9cdca6b4"} Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.049481 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.049499 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" event={"ID":"d3652fe0-4889-432f-af3f-787dd19c60d6","Type":"ContainerDied","Data":"f63eb7186d86d6bf6062656b177ec8adc1e47100bcb490f0e226ebe524f4ea53"} Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.049514 4764 scope.go:117] "RemoveContainer" containerID="d4f1dae0a1e050bab0d3f5cbb5a68f2b2bcb5dd4773d77210e4e118d9cdca6b4" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.066917 4764 scope.go:117] "RemoveContainer" containerID="d4f1dae0a1e050bab0d3f5cbb5a68f2b2bcb5dd4773d77210e4e118d9cdca6b4" Mar 09 13:32:58 crc kubenswrapper[4764]: E0309 13:32:58.067268 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4f1dae0a1e050bab0d3f5cbb5a68f2b2bcb5dd4773d77210e4e118d9cdca6b4\": container with ID starting with d4f1dae0a1e050bab0d3f5cbb5a68f2b2bcb5dd4773d77210e4e118d9cdca6b4 not found: ID does not exist" 
containerID="d4f1dae0a1e050bab0d3f5cbb5a68f2b2bcb5dd4773d77210e4e118d9cdca6b4" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.067311 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4f1dae0a1e050bab0d3f5cbb5a68f2b2bcb5dd4773d77210e4e118d9cdca6b4"} err="failed to get container status \"d4f1dae0a1e050bab0d3f5cbb5a68f2b2bcb5dd4773d77210e4e118d9cdca6b4\": rpc error: code = NotFound desc = could not find container \"d4f1dae0a1e050bab0d3f5cbb5a68f2b2bcb5dd4773d77210e4e118d9cdca6b4\": container with ID starting with d4f1dae0a1e050bab0d3f5cbb5a68f2b2bcb5dd4773d77210e4e118d9cdca6b4 not found: ID does not exist" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.137485 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-bound-sa-token\") pod \"d3652fe0-4889-432f-af3f-787dd19c60d6\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.137889 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d3652fe0-4889-432f-af3f-787dd19c60d6-ca-trust-extracted\") pod \"d3652fe0-4889-432f-af3f-787dd19c60d6\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.138053 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"d3652fe0-4889-432f-af3f-787dd19c60d6\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.138081 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/d3652fe0-4889-432f-af3f-787dd19c60d6-registry-certificates\") pod \"d3652fe0-4889-432f-af3f-787dd19c60d6\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.138185 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d3652fe0-4889-432f-af3f-787dd19c60d6-trusted-ca\") pod \"d3652fe0-4889-432f-af3f-787dd19c60d6\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.138215 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d3652fe0-4889-432f-af3f-787dd19c60d6-installation-pull-secrets\") pod \"d3652fe0-4889-432f-af3f-787dd19c60d6\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.138234 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-registry-tls\") pod \"d3652fe0-4889-432f-af3f-787dd19c60d6\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.138280 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsz44\" (UniqueName: \"kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-kube-api-access-xsz44\") pod \"d3652fe0-4889-432f-af3f-787dd19c60d6\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.138864 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3652fe0-4889-432f-af3f-787dd19c60d6-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "d3652fe0-4889-432f-af3f-787dd19c60d6" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.139183 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3652fe0-4889-432f-af3f-787dd19c60d6-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "d3652fe0-4889-432f-af3f-787dd19c60d6" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.144487 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "d3652fe0-4889-432f-af3f-787dd19c60d6" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.144513 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "d3652fe0-4889-432f-af3f-787dd19c60d6" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.144570 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3652fe0-4889-432f-af3f-787dd19c60d6-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "d3652fe0-4889-432f-af3f-787dd19c60d6" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.144854 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-kube-api-access-xsz44" (OuterVolumeSpecName: "kube-api-access-xsz44") pod "d3652fe0-4889-432f-af3f-787dd19c60d6" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6"). InnerVolumeSpecName "kube-api-access-xsz44". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.151631 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "d3652fe0-4889-432f-af3f-787dd19c60d6" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.155599 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3652fe0-4889-432f-af3f-787dd19c60d6-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "d3652fe0-4889-432f-af3f-787dd19c60d6" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.239446 4764 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.239485 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsz44\" (UniqueName: \"kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-kube-api-access-xsz44\") on node \"crc\" DevicePath \"\"" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.239497 4764 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.239506 4764 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d3652fe0-4889-432f-af3f-787dd19c60d6-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.239515 4764 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d3652fe0-4889-432f-af3f-787dd19c60d6-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.239523 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d3652fe0-4889-432f-af3f-787dd19c60d6-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.239531 4764 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d3652fe0-4889-432f-af3f-787dd19c60d6-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 09 13:32:58 crc 
kubenswrapper[4764]: I0309 13:32:58.378438 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-w49hn"] Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.382296 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-w49hn"] Mar 09 13:32:59 crc kubenswrapper[4764]: I0309 13:32:59.568177 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3652fe0-4889-432f-af3f-787dd19c60d6" path="/var/lib/kubelet/pods/d3652fe0-4889-432f-af3f-787dd19c60d6/volumes" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.150771 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-qr7fv"] Mar 09 13:33:34 crc kubenswrapper[4764]: E0309 13:33:34.151631 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3652fe0-4889-432f-af3f-787dd19c60d6" containerName="registry" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.151660 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3652fe0-4889-432f-af3f-787dd19c60d6" containerName="registry" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.151753 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3652fe0-4889-432f-af3f-787dd19c60d6" containerName="registry" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.152170 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-qr7fv" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.154413 4764 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-24l57" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.154417 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.158500 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-qr7fv"] Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.158636 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.170706 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-j8rlp"] Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.171471 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-j8rlp" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.173301 4764 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-4tlmc" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.177004 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-lhpfw"] Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.177887 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-lhpfw" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.179953 4764 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-q2chc" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.189467 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-lhpfw"] Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.198754 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-j8rlp"] Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.349274 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxtwr\" (UniqueName: \"kubernetes.io/projected/09aeffa2-590d-4062-95ff-40dbdda54df7-kube-api-access-fxtwr\") pod \"cert-manager-webhook-687f57d79b-j8rlp\" (UID: \"09aeffa2-590d-4062-95ff-40dbdda54df7\") " pod="cert-manager/cert-manager-webhook-687f57d79b-j8rlp" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.349423 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcdfn\" (UniqueName: \"kubernetes.io/projected/0a35f012-3965-4680-aa01-9fa97f956c68-kube-api-access-jcdfn\") pod \"cert-manager-cainjector-cf98fcc89-qr7fv\" (UID: \"0a35f012-3965-4680-aa01-9fa97f956c68\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-qr7fv" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.349735 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfg2w\" (UniqueName: \"kubernetes.io/projected/2eef62f2-5973-47e2-b921-9e1a05b9f8fb-kube-api-access-qfg2w\") pod \"cert-manager-858654f9db-lhpfw\" (UID: \"2eef62f2-5973-47e2-b921-9e1a05b9f8fb\") " pod="cert-manager/cert-manager-858654f9db-lhpfw" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.450746 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfg2w\" (UniqueName: \"kubernetes.io/projected/2eef62f2-5973-47e2-b921-9e1a05b9f8fb-kube-api-access-qfg2w\") pod \"cert-manager-858654f9db-lhpfw\" (UID: \"2eef62f2-5973-47e2-b921-9e1a05b9f8fb\") " pod="cert-manager/cert-manager-858654f9db-lhpfw" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.450825 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxtwr\" (UniqueName: \"kubernetes.io/projected/09aeffa2-590d-4062-95ff-40dbdda54df7-kube-api-access-fxtwr\") pod \"cert-manager-webhook-687f57d79b-j8rlp\" (UID: \"09aeffa2-590d-4062-95ff-40dbdda54df7\") " pod="cert-manager/cert-manager-webhook-687f57d79b-j8rlp" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.450895 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcdfn\" (UniqueName: \"kubernetes.io/projected/0a35f012-3965-4680-aa01-9fa97f956c68-kube-api-access-jcdfn\") pod \"cert-manager-cainjector-cf98fcc89-qr7fv\" (UID: \"0a35f012-3965-4680-aa01-9fa97f956c68\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-qr7fv" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.472601 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfg2w\" (UniqueName: \"kubernetes.io/projected/2eef62f2-5973-47e2-b921-9e1a05b9f8fb-kube-api-access-qfg2w\") pod \"cert-manager-858654f9db-lhpfw\" (UID: \"2eef62f2-5973-47e2-b921-9e1a05b9f8fb\") " pod="cert-manager/cert-manager-858654f9db-lhpfw" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.474022 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcdfn\" (UniqueName: \"kubernetes.io/projected/0a35f012-3965-4680-aa01-9fa97f956c68-kube-api-access-jcdfn\") pod \"cert-manager-cainjector-cf98fcc89-qr7fv\" (UID: \"0a35f012-3965-4680-aa01-9fa97f956c68\") " 
pod="cert-manager/cert-manager-cainjector-cf98fcc89-qr7fv" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.474833 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxtwr\" (UniqueName: \"kubernetes.io/projected/09aeffa2-590d-4062-95ff-40dbdda54df7-kube-api-access-fxtwr\") pod \"cert-manager-webhook-687f57d79b-j8rlp\" (UID: \"09aeffa2-590d-4062-95ff-40dbdda54df7\") " pod="cert-manager/cert-manager-webhook-687f57d79b-j8rlp" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.489513 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-j8rlp" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.499635 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-lhpfw" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.770025 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-qr7fv" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.914219 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-j8rlp"] Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.950772 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-lhpfw"] Mar 09 13:33:35 crc kubenswrapper[4764]: I0309 13:33:35.008767 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-qr7fv"] Mar 09 13:33:35 crc kubenswrapper[4764]: W0309 13:33:35.009525 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a35f012_3965_4680_aa01_9fa97f956c68.slice/crio-d1cb5ec4000d3723379906e3955229a2d55d6879e00779e674670085fb728cc2 WatchSource:0}: Error finding container d1cb5ec4000d3723379906e3955229a2d55d6879e00779e674670085fb728cc2: 
Status 404 returned error can't find the container with id d1cb5ec4000d3723379906e3955229a2d55d6879e00779e674670085fb728cc2 Mar 09 13:33:35 crc kubenswrapper[4764]: I0309 13:33:35.266593 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-j8rlp" event={"ID":"09aeffa2-590d-4062-95ff-40dbdda54df7","Type":"ContainerStarted","Data":"d4978b2c656585075e63b58f95bc18a19903727c4830d4aebbc64b22f9312a79"} Mar 09 13:33:35 crc kubenswrapper[4764]: I0309 13:33:35.267728 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-lhpfw" event={"ID":"2eef62f2-5973-47e2-b921-9e1a05b9f8fb","Type":"ContainerStarted","Data":"5011261d07ae7d19f0c1eb436f07682a98b94db716205069e7f407639312cbfa"} Mar 09 13:33:35 crc kubenswrapper[4764]: I0309 13:33:35.269345 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-qr7fv" event={"ID":"0a35f012-3965-4680-aa01-9fa97f956c68","Type":"ContainerStarted","Data":"d1cb5ec4000d3723379906e3955229a2d55d6879e00779e674670085fb728cc2"} Mar 09 13:33:39 crc kubenswrapper[4764]: I0309 13:33:39.294071 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-qr7fv" event={"ID":"0a35f012-3965-4680-aa01-9fa97f956c68","Type":"ContainerStarted","Data":"afc8b1247bf6e0d5f3094f3398496746d4f571421819ba50153ad48035c09c70"} Mar 09 13:33:39 crc kubenswrapper[4764]: I0309 13:33:39.295922 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-j8rlp" event={"ID":"09aeffa2-590d-4062-95ff-40dbdda54df7","Type":"ContainerStarted","Data":"612f15629d8e9cf120f39849624aa663b8348f7e24b5bc70f3853c80c8ccad71"} Mar 09 13:33:39 crc kubenswrapper[4764]: I0309 13:33:39.296077 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-j8rlp" Mar 09 13:33:39 crc kubenswrapper[4764]: I0309 
13:33:39.297217 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-lhpfw" event={"ID":"2eef62f2-5973-47e2-b921-9e1a05b9f8fb","Type":"ContainerStarted","Data":"960fe3c14eef15d17901031ba54697393727791b60a66308cbbf53854e95f0df"} Mar 09 13:33:39 crc kubenswrapper[4764]: I0309 13:33:39.314122 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-qr7fv" podStartSLOduration=1.5016224729999998 podStartE2EDuration="5.314100875s" podCreationTimestamp="2026-03-09 13:33:34 +0000 UTC" firstStartedPulling="2026-03-09 13:33:35.012464646 +0000 UTC m=+770.262636554" lastFinishedPulling="2026-03-09 13:33:38.824943048 +0000 UTC m=+774.075114956" observedRunningTime="2026-03-09 13:33:39.311572397 +0000 UTC m=+774.561744315" watchObservedRunningTime="2026-03-09 13:33:39.314100875 +0000 UTC m=+774.564272783" Mar 09 13:33:39 crc kubenswrapper[4764]: I0309 13:33:39.327857 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-j8rlp" podStartSLOduration=1.433348928 podStartE2EDuration="5.327832904s" podCreationTimestamp="2026-03-09 13:33:34 +0000 UTC" firstStartedPulling="2026-03-09 13:33:34.927483332 +0000 UTC m=+770.177655240" lastFinishedPulling="2026-03-09 13:33:38.821967308 +0000 UTC m=+774.072139216" observedRunningTime="2026-03-09 13:33:39.327519655 +0000 UTC m=+774.577691553" watchObservedRunningTime="2026-03-09 13:33:39.327832904 +0000 UTC m=+774.578004822" Mar 09 13:33:39 crc kubenswrapper[4764]: I0309 13:33:39.350307 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-lhpfw" podStartSLOduration=1.423264047 podStartE2EDuration="5.350287667s" podCreationTimestamp="2026-03-09 13:33:34 +0000 UTC" firstStartedPulling="2026-03-09 13:33:34.961290791 +0000 UTC m=+770.211462699" lastFinishedPulling="2026-03-09 13:33:38.888314411 +0000 UTC 
m=+774.138486319" observedRunningTime="2026-03-09 13:33:39.349224819 +0000 UTC m=+774.599396727" watchObservedRunningTime="2026-03-09 13:33:39.350287667 +0000 UTC m=+774.600459575" Mar 09 13:33:44 crc kubenswrapper[4764]: I0309 13:33:44.492193 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-j8rlp" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.370363 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.371186 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.492091 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7kggv"] Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.492632 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="northd" containerID="cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519" gracePeriod=30 Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.492638 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="sbdb" containerID="cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017" 
gracePeriod=30 Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.492577 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="nbdb" containerID="cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d" gracePeriod=30 Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.492818 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="kube-rbac-proxy-node" containerID="cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01" gracePeriod=30 Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.492687 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29" gracePeriod=30 Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.492736 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovn-controller" containerID="cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4" gracePeriod=30 Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.492674 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovn-acl-logging" containerID="cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb" gracePeriod=30 Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.536406 4764 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" containerID="cri-o://00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29" gracePeriod=30 Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.722673 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d is running failed: container process not found" containerID="6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.722745 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29 is running failed: container process not found" containerID="00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.722814 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017 is running failed: container process not found" containerID="df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.723532 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29 is running failed: container process not found" containerID="00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.723594 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017 is running failed: container process not found" containerID="df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.723665 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d is running failed: container process not found" containerID="6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.727150 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017 is running failed: container process not found" containerID="df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.727173 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d is running failed: container process not found" containerID="6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.727199 4764 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017 is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="sbdb" Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.727231 4764 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="nbdb" Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.727286 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29 is running failed: container process not found" containerID="00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.727313 4764 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29 is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.844009 4764 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovnkube-controller/3.log" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.846079 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovn-acl-logging/0.log" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.846654 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovn-controller/0.log" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.847101 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905054 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-65sdb"] Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.905314 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905326 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.905337 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="kube-rbac-proxy-ovn-metrics" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905346 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="kube-rbac-proxy-ovn-metrics" Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.905354 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" 
containerName="kube-rbac-proxy-node" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905362 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="kube-rbac-proxy-node" Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.905379 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="kubecfg-setup" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905386 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="kubecfg-setup" Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.905396 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovn-acl-logging" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905403 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovn-acl-logging" Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.905416 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905423 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.905430 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="northd" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905437 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="northd" Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.905445 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="sbdb" 
Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905454 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="sbdb" Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.905464 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905470 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.905478 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovn-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905487 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovn-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.905494 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="nbdb" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905500 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="nbdb" Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.905512 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905519 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905621 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" Mar 09 13:33:58 crc 
kubenswrapper[4764]: I0309 13:33:58.905633 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905659 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905667 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905678 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="sbdb" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905685 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="northd" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905695 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="kube-rbac-proxy-node" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905703 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="kube-rbac-proxy-ovn-metrics" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905711 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovn-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905720 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="nbdb" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905729 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" 
containerName="ovn-acl-logging" Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.905864 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905875 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905995 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.908251 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.004199 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.004274 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.004278 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-run-netns\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.004335 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.004714 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovnkube-config\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.004784 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-log-socket\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.004802 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-openvswitch\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.004829 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovn-node-metrics-cert\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.004860 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-env-overrides\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.004887 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovnkube-script-lib\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.004908 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.004940 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.004908 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-log-socket" (OuterVolumeSpecName: "log-socket") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.004919 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-var-lib-openvswitch\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.004979 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-run-ovn-kubernetes\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005002 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-slash\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005036 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-cni-bin\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005065 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5xrv\" (UniqueName: \"kubernetes.io/projected/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-kube-api-access-r5xrv\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005082 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-systemd\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005098 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-ovn\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005121 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-etc-openvswitch\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005166 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-node-log\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005204 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-systemd-units\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: 
\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005221 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-kubelet\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005259 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-cni-netd\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005326 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005364 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005411 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-node-log" (OuterVolumeSpecName: "node-log") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005408 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005438 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005449 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005458 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005480 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005483 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-slash" (OuterVolumeSpecName: "host-slash") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005491 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-run-systemd\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005498 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005531 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-kubelet\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005511 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005512 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005609 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-run-netns\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005693 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9ce85b1d-79b3-4669-a169-bfcd058c8931-ovnkube-script-lib\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005774 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-node-log\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005829 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-slash\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005876 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-cni-netd\") pod \"ovnkube-node-65sdb\" (UID: 
\"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005959 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-cni-bin\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005996 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9ce85b1d-79b3-4669-a169-bfcd058c8931-ovnkube-config\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006024 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-systemd-units\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006075 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9ce85b1d-79b3-4669-a169-bfcd058c8931-env-overrides\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006104 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-run-openvswitch\") pod 
\"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006180 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006209 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcw8n\" (UniqueName: \"kubernetes.io/projected/9ce85b1d-79b3-4669-a169-bfcd058c8931-kube-api-access-kcw8n\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006229 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-etc-openvswitch\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006303 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-log-socket\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006335 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/9ce85b1d-79b3-4669-a169-bfcd058c8931-ovn-node-metrics-cert\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006378 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-run-ovn-kubernetes\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006401 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-run-ovn\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006419 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-var-lib-openvswitch\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006510 4764 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006531 4764 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc 
kubenswrapper[4764]: I0309 13:33:59.006547 4764 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006560 4764 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006573 4764 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-slash\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006584 4764 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006598 4764 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006612 4764 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006627 4764 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-node-log\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006661 4764 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" 
(UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006677 4764 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006692 4764 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006707 4764 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006719 4764 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006731 4764 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006745 4764 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-log-socket\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006755 4764 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.010778 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.011242 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-kube-api-access-r5xrv" (OuterVolumeSpecName: "kube-api-access-r5xrv") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "kube-api-access-r5xrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.018382 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107151 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9ce85b1d-79b3-4669-a169-bfcd058c8931-env-overrides\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107209 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-run-openvswitch\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107246 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107268 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcw8n\" (UniqueName: \"kubernetes.io/projected/9ce85b1d-79b3-4669-a169-bfcd058c8931-kube-api-access-kcw8n\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107294 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-etc-openvswitch\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107299 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-run-openvswitch\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107315 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9ce85b1d-79b3-4669-a169-bfcd058c8931-ovn-node-metrics-cert\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107337 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-log-socket\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107351 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-etc-openvswitch\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107361 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-run-ovn-kubernetes\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc 
kubenswrapper[4764]: I0309 13:33:59.107386 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-run-ovn\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107408 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-var-lib-openvswitch\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107416 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107460 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-run-systemd\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107430 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-run-systemd\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107504 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-run-ovn-kubernetes\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107513 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-log-socket\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107560 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-kubelet\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107541 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-run-ovn\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107529 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-kubelet\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107565 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-var-lib-openvswitch\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107735 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-run-netns\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107780 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9ce85b1d-79b3-4669-a169-bfcd058c8931-ovnkube-script-lib\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107820 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-node-log\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107852 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-run-netns\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107885 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-slash\") pod 
\"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107911 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-node-log\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107917 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-cni-netd\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107927 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-slash\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107979 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-cni-netd\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.108005 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-cni-bin\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 
13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.108038 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9ce85b1d-79b3-4669-a169-bfcd058c8931-ovnkube-config\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.108077 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-systemd-units\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.108108 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-cni-bin\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.108201 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-systemd-units\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.108229 4764 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.108293 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5xrv\" (UniqueName: 
\"kubernetes.io/projected/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-kube-api-access-r5xrv\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.108312 4764 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.108488 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9ce85b1d-79b3-4669-a169-bfcd058c8931-ovnkube-script-lib\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.109308 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9ce85b1d-79b3-4669-a169-bfcd058c8931-env-overrides\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.109666 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9ce85b1d-79b3-4669-a169-bfcd058c8931-ovnkube-config\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.110289 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9ce85b1d-79b3-4669-a169-bfcd058c8931-ovn-node-metrics-cert\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.126848 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-kcw8n\" (UniqueName: \"kubernetes.io/projected/9ce85b1d-79b3-4669-a169-bfcd058c8931-kube-api-access-kcw8n\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.223068 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: W0309 13:33:59.241968 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ce85b1d_79b3_4669_a169_bfcd058c8931.slice/crio-3d8a2a2f3b83361d950fc0816e501c19b17d60a681dbf475fb25e36cffe9c59e WatchSource:0}: Error finding container 3d8a2a2f3b83361d950fc0816e501c19b17d60a681dbf475fb25e36cffe9c59e: Status 404 returned error can't find the container with id 3d8a2a2f3b83361d950fc0816e501c19b17d60a681dbf475fb25e36cffe9c59e Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.420519 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovnkube-controller/3.log" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.424107 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovn-acl-logging/0.log" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.424860 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovn-controller/0.log" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425360 4764 generic.go:334] "Generic (PLEG): container finished" podID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerID="00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29" exitCode=0 Mar 09 13:33:59 crc 
kubenswrapper[4764]: I0309 13:33:59.425400 4764 generic.go:334] "Generic (PLEG): container finished" podID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerID="df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017" exitCode=0 Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425421 4764 generic.go:334] "Generic (PLEG): container finished" podID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerID="6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d" exitCode=0 Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425434 4764 generic.go:334] "Generic (PLEG): container finished" podID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerID="20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519" exitCode=0 Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425447 4764 generic.go:334] "Generic (PLEG): container finished" podID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerID="21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29" exitCode=0 Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425462 4764 generic.go:334] "Generic (PLEG): container finished" podID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerID="9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01" exitCode=0 Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425469 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425492 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerDied","Data":"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425529 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerDied","Data":"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425547 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerDied","Data":"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425565 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerDied","Data":"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425584 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerDied","Data":"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425602 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerDied","Data":"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01"} Mar 09 13:33:59 crc 
kubenswrapper[4764]: I0309 13:33:59.425621 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425637 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425679 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425690 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425701 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425710 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425721 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425732 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4"} Mar 09 13:33:59 crc 
kubenswrapper[4764]: I0309 13:33:59.425742 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425757 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerDied","Data":"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425774 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425786 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425475 4764 generic.go:334] "Generic (PLEG): container finished" podID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerID="8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb" exitCode=143 Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425812 4764 scope.go:117] "RemoveContainer" containerID="00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425818 4764 generic.go:334] "Generic (PLEG): container finished" podID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerID="9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4" exitCode=143 Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425798 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017"} Mar 09 
13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425864 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425871 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425878 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425883 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425888 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425894 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425899 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425908 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" 
event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerDied","Data":"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425920 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425927 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425933 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425939 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425945 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425951 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425957 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425963 4764 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425969 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425982 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425991 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerDied","Data":"65054061a375abf79424d7095dced33c492d7e93ee2f22f668abd5a0c5ced956"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.426000 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.426007 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.426012 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.426018 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d"} Mar 09 13:33:59 crc kubenswrapper[4764]: 
I0309 13:33:59.426023 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519"}
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.426028 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29"}
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.426035 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01"}
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.426041 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb"}
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.426046 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4"}
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.426052 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66"}
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.429561 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" event={"ID":"9ce85b1d-79b3-4669-a169-bfcd058c8931","Type":"ContainerDied","Data":"14662e1e519ca2bde52ecac49d404da06daa59e83344f6bc1a62155769c808e5"}
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.445607 4764 scope.go:117] "RemoveContainer" containerID="9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.429527 4764 generic.go:334] "Generic (PLEG): container finished" podID="9ce85b1d-79b3-4669-a169-bfcd058c8931" containerID="14662e1e519ca2bde52ecac49d404da06daa59e83344f6bc1a62155769c808e5" exitCode=0
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.445936 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" event={"ID":"9ce85b1d-79b3-4669-a169-bfcd058c8931","Type":"ContainerStarted","Data":"3d8a2a2f3b83361d950fc0816e501c19b17d60a681dbf475fb25e36cffe9c59e"}
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.448741 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zmzm7_202a1f58-ce83-4374-ac48-dc806f7b9d6b/kube-multus/2.log"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.449615 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zmzm7_202a1f58-ce83-4374-ac48-dc806f7b9d6b/kube-multus/1.log"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.449734 4764 generic.go:334] "Generic (PLEG): container finished" podID="202a1f58-ce83-4374-ac48-dc806f7b9d6b" containerID="63894bb3203376238f9013fa9abe0d5e50f67b5675ce93e7ce0b6cdd92b326b3" exitCode=2
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.449765 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zmzm7" event={"ID":"202a1f58-ce83-4374-ac48-dc806f7b9d6b","Type":"ContainerDied","Data":"63894bb3203376238f9013fa9abe0d5e50f67b5675ce93e7ce0b6cdd92b326b3"}
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.450369 4764 scope.go:117] "RemoveContainer" containerID="63894bb3203376238f9013fa9abe0d5e50f67b5675ce93e7ce0b6cdd92b326b3"
Mar 09 13:33:59 crc kubenswrapper[4764]: E0309 13:33:59.451314 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-zmzm7_openshift-multus(202a1f58-ce83-4374-ac48-dc806f7b9d6b)\"" pod="openshift-multus/multus-zmzm7" podUID="202a1f58-ce83-4374-ac48-dc806f7b9d6b"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.492709 4764 scope.go:117] "RemoveContainer" containerID="df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.514048 4764 scope.go:117] "RemoveContainer" containerID="6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.526006 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7kggv"]
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.532271 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7kggv"]
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.535834 4764 scope.go:117] "RemoveContainer" containerID="20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.558715 4764 scope.go:117] "RemoveContainer" containerID="21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.566851 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" path="/var/lib/kubelet/pods/b8ccb4f5-550a-41b2-b39d-201cdd5d902a/volumes"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.575534 4764 scope.go:117] "RemoveContainer" containerID="9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.590093 4764 scope.go:117] "RemoveContainer" containerID="8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.606021 4764 scope.go:117] "RemoveContainer" containerID="9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.633468 4764 scope.go:117] "RemoveContainer" containerID="cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.675802 4764 scope.go:117] "RemoveContainer" containerID="00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29"
Mar 09 13:33:59 crc kubenswrapper[4764]: E0309 13:33:59.676302 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29\": container with ID starting with 00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29 not found: ID does not exist" containerID="00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.676333 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29"} err="failed to get container status \"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29\": rpc error: code = NotFound desc = could not find container \"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29\": container with ID starting with 00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.676355 4764 scope.go:117] "RemoveContainer" containerID="9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"
Mar 09 13:33:59 crc kubenswrapper[4764]: E0309 13:33:59.676760 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f\": container with ID starting with 9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f not found: ID does not exist" containerID="9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.676781 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"} err="failed to get container status \"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f\": rpc error: code = NotFound desc = could not find container \"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f\": container with ID starting with 9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.676794 4764 scope.go:117] "RemoveContainer" containerID="df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017"
Mar 09 13:33:59 crc kubenswrapper[4764]: E0309 13:33:59.677061 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\": container with ID starting with df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017 not found: ID does not exist" containerID="df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.677082 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017"} err="failed to get container status \"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\": rpc error: code = NotFound desc = could not find container \"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\": container with ID starting with df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.677096 4764 scope.go:117] "RemoveContainer" containerID="6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d"
Mar 09 13:33:59 crc kubenswrapper[4764]: E0309 13:33:59.677450 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\": container with ID starting with 6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d not found: ID does not exist" containerID="6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.677468 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d"} err="failed to get container status \"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\": rpc error: code = NotFound desc = could not find container \"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\": container with ID starting with 6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.677479 4764 scope.go:117] "RemoveContainer" containerID="20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519"
Mar 09 13:33:59 crc kubenswrapper[4764]: E0309 13:33:59.677835 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\": container with ID starting with 20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519 not found: ID does not exist" containerID="20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.677855 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519"} err="failed to get container status \"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\": rpc error: code = NotFound desc = could not find container \"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\": container with ID starting with 20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.677868 4764 scope.go:117] "RemoveContainer" containerID="21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29"
Mar 09 13:33:59 crc kubenswrapper[4764]: E0309 13:33:59.678104 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\": container with ID starting with 21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29 not found: ID does not exist" containerID="21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.678120 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29"} err="failed to get container status \"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\": rpc error: code = NotFound desc = could not find container \"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\": container with ID starting with 21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.678133 4764 scope.go:117] "RemoveContainer" containerID="9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01"
Mar 09 13:33:59 crc kubenswrapper[4764]: E0309 13:33:59.678403 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\": container with ID starting with 9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01 not found: ID does not exist" containerID="9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.678429 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01"} err="failed to get container status \"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\": rpc error: code = NotFound desc = could not find container \"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\": container with ID starting with 9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.678443 4764 scope.go:117] "RemoveContainer" containerID="8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb"
Mar 09 13:33:59 crc kubenswrapper[4764]: E0309 13:33:59.678677 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\": container with ID starting with 8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb not found: ID does not exist" containerID="8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.678697 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb"} err="failed to get container status \"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\": rpc error: code = NotFound desc = could not find container \"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\": container with ID starting with 8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.678711 4764 scope.go:117] "RemoveContainer" containerID="9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4"
Mar 09 13:33:59 crc kubenswrapper[4764]: E0309 13:33:59.678927 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\": container with ID starting with 9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4 not found: ID does not exist" containerID="9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.678946 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4"} err="failed to get container status \"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\": rpc error: code = NotFound desc = could not find container \"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\": container with ID starting with 9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.678959 4764 scope.go:117] "RemoveContainer" containerID="cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66"
Mar 09 13:33:59 crc kubenswrapper[4764]: E0309 13:33:59.679251 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\": container with ID starting with cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66 not found: ID does not exist" containerID="cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.679269 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66"} err="failed to get container status \"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\": rpc error: code = NotFound desc = could not find container \"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\": container with ID starting with cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.679281 4764 scope.go:117] "RemoveContainer" containerID="00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.679479 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29"} err="failed to get container status \"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29\": rpc error: code = NotFound desc = could not find container \"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29\": container with ID starting with 00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.679495 4764 scope.go:117] "RemoveContainer" containerID="9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.679742 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"} err="failed to get container status \"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f\": rpc error: code = NotFound desc = could not find container \"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f\": container with ID starting with 9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.679759 4764 scope.go:117] "RemoveContainer" containerID="df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.679966 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017"} err="failed to get container status \"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\": rpc error: code = NotFound desc = could not find container \"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\": container with ID starting with df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.679983 4764 scope.go:117] "RemoveContainer" containerID="6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.680234 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d"} err="failed to get container status \"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\": rpc error: code = NotFound desc = could not find container \"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\": container with ID starting with 6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.680253 4764 scope.go:117] "RemoveContainer" containerID="20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.681165 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519"} err="failed to get container status \"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\": rpc error: code = NotFound desc = could not find container \"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\": container with ID starting with 20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.681190 4764 scope.go:117] "RemoveContainer" containerID="21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.681482 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29"} err="failed to get container status \"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\": rpc error: code = NotFound desc = could not find container \"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\": container with ID starting with 21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.681501 4764 scope.go:117] "RemoveContainer" containerID="9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.681854 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01"} err="failed to get container status \"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\": rpc error: code = NotFound desc = could not find container \"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\": container with ID starting with 9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.681874 4764 scope.go:117] "RemoveContainer" containerID="8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.682440 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb"} err="failed to get container status \"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\": rpc error: code = NotFound desc = could not find container \"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\": container with ID starting with 8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.682458 4764 scope.go:117] "RemoveContainer" containerID="9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.682690 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4"} err="failed to get container status \"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\": rpc error: code = NotFound desc = could not find container \"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\": container with ID starting with 9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.682712 4764 scope.go:117] "RemoveContainer" containerID="cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.683280 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66"} err="failed to get container status \"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\": rpc error: code = NotFound desc = could not find container \"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\": container with ID starting with cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.683325 4764 scope.go:117] "RemoveContainer" containerID="00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.683704 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29"} err="failed to get container status \"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29\": rpc error: code = NotFound desc = could not find container \"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29\": container with ID starting with 00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.683723 4764 scope.go:117] "RemoveContainer" containerID="9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.683922 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"} err="failed to get container status \"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f\": rpc error: code = NotFound desc = could not find container \"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f\": container with ID starting with 9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.683936 4764 scope.go:117] "RemoveContainer" containerID="df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.684140 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017"} err="failed to get container status \"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\": rpc error: code = NotFound desc = could not find container \"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\": container with ID starting with df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.684151 4764 scope.go:117] "RemoveContainer" containerID="6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.684497 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d"} err="failed to get container status \"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\": rpc error: code = NotFound desc = could not find container \"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\": container with ID starting with 6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.684512 4764 scope.go:117] "RemoveContainer" containerID="20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.684809 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519"} err="failed to get container status \"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\": rpc error: code = NotFound desc = could not find container \"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\": container with ID starting with 20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.684826 4764 scope.go:117] "RemoveContainer" containerID="21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.685346 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29"} err="failed to get container status \"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\": rpc error: code = NotFound desc = could not find container \"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\": container with ID starting with 21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.685373 4764 scope.go:117] "RemoveContainer" containerID="9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.685689 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01"} err="failed to get container status \"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\": rpc error: code = NotFound desc = could not find container \"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\": container with ID starting with 9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.685714 4764 scope.go:117] "RemoveContainer" containerID="8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.686005 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb"} err="failed to get container status \"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\": rpc error: code = NotFound desc = could not find container \"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\": container with ID starting with 8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.686030 4764 scope.go:117] "RemoveContainer" containerID="9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.686219 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4"} err="failed to get container status \"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\": rpc error: code = NotFound desc = could not find container \"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\": container with ID starting with 9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.686238 4764 scope.go:117] "RemoveContainer" containerID="cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.686417 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66"} err="failed to get container status \"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\": rpc error: code = NotFound desc = could not find container \"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\": container with ID starting with cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.686436 4764 scope.go:117] "RemoveContainer" containerID="00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.686693 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29"} err="failed to get container status \"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29\": rpc error: code = NotFound desc = could not find container \"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29\": container with ID starting with 00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.686715 4764 scope.go:117] "RemoveContainer" containerID="9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.687034 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"} err="failed to get container status \"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f\": rpc error: code = NotFound desc = could not find container \"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f\": container with ID starting with 9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.687054 4764 scope.go:117] "RemoveContainer" containerID="df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.687340 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017"} err="failed to get container status \"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\": rpc error: code = NotFound desc = could not find container \"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\": container with ID starting with df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.687360 4764 scope.go:117] "RemoveContainer" containerID="6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.687525 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d"} err="failed to get container status \"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\": rpc error: code = NotFound desc = could not find container \"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\": container with ID starting with 6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.687547 4764 scope.go:117] "RemoveContainer" containerID="20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.687851 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519"} err="failed to get container status \"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\": rpc error: code = NotFound desc = could not find container \"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\": container with ID starting with 20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.687879 4764 scope.go:117] "RemoveContainer" containerID="21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.688158 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29"} err="failed to get container status \"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\": rpc error: code = NotFound desc = could not find container \"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\": container with ID starting with 21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.688199 4764 scope.go:117] "RemoveContainer" containerID="9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.688590 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01"} err="failed to get container status \"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\": rpc error: code = NotFound desc = could not find container \"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\": container with ID starting with 9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.688620 4764 scope.go:117] "RemoveContainer" containerID="8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.688896 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb"} err="failed to get container status \"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\": rpc error: code = NotFound desc = could not find container \"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\": container with ID starting with 8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.688923 4764 scope.go:117] "RemoveContainer" containerID="9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.689173 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4"} err="failed to get container status \"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\": rpc error: code = NotFound desc = could not find container \"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\": container with ID starting with 9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.689196 4764 scope.go:117] "RemoveContainer" containerID="cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.689363 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66"} err="failed to get container status \"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\": rpc error: code = NotFound desc = could not find container \"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\": container with ID starting with cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66 not found: ID does not
exist" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.689384 4764 scope.go:117] "RemoveContainer" containerID="00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.689563 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29"} err="failed to get container status \"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29\": rpc error: code = NotFound desc = could not find container \"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29\": container with ID starting with 00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29 not found: ID does not exist" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.689580 4764 scope.go:117] "RemoveContainer" containerID="ae16a6074b17ec1df33139f5a8f220d15bea5658fc20ad653e2af7b2427dec6a" Mar 09 13:34:00 crc kubenswrapper[4764]: I0309 13:34:00.143677 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551054-n54vp"] Mar 09 13:34:00 crc kubenswrapper[4764]: I0309 13:34:00.145442 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551054-n54vp" Mar 09 13:34:00 crc kubenswrapper[4764]: I0309 13:34:00.147143 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 13:34:00 crc kubenswrapper[4764]: I0309 13:34:00.147766 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:34:00 crc kubenswrapper[4764]: I0309 13:34:00.147868 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:34:00 crc kubenswrapper[4764]: I0309 13:34:00.227726 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5wg8\" (UniqueName: \"kubernetes.io/projected/034371f5-4d6d-4a44-9678-9093ffaf3f9d-kube-api-access-c5wg8\") pod \"auto-csr-approver-29551054-n54vp\" (UID: \"034371f5-4d6d-4a44-9678-9093ffaf3f9d\") " pod="openshift-infra/auto-csr-approver-29551054-n54vp" Mar 09 13:34:00 crc kubenswrapper[4764]: I0309 13:34:00.328873 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5wg8\" (UniqueName: \"kubernetes.io/projected/034371f5-4d6d-4a44-9678-9093ffaf3f9d-kube-api-access-c5wg8\") pod \"auto-csr-approver-29551054-n54vp\" (UID: \"034371f5-4d6d-4a44-9678-9093ffaf3f9d\") " pod="openshift-infra/auto-csr-approver-29551054-n54vp" Mar 09 13:34:00 crc kubenswrapper[4764]: I0309 13:34:00.348863 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5wg8\" (UniqueName: \"kubernetes.io/projected/034371f5-4d6d-4a44-9678-9093ffaf3f9d-kube-api-access-c5wg8\") pod \"auto-csr-approver-29551054-n54vp\" (UID: \"034371f5-4d6d-4a44-9678-9093ffaf3f9d\") " pod="openshift-infra/auto-csr-approver-29551054-n54vp" Mar 09 13:34:00 crc kubenswrapper[4764]: I0309 13:34:00.465467 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" event={"ID":"9ce85b1d-79b3-4669-a169-bfcd058c8931","Type":"ContainerStarted","Data":"3a848f9e9d70d363afd145136252d0d21b1181fb7c9635668eaf736178838a28"} Mar 09 13:34:00 crc kubenswrapper[4764]: I0309 13:34:00.465511 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" event={"ID":"9ce85b1d-79b3-4669-a169-bfcd058c8931","Type":"ContainerStarted","Data":"48ef6cb01cdd4dbf12a234f8bbe08162d6b5f52d24cd627d574666c158014d27"} Mar 09 13:34:00 crc kubenswrapper[4764]: I0309 13:34:00.465529 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" event={"ID":"9ce85b1d-79b3-4669-a169-bfcd058c8931","Type":"ContainerStarted","Data":"f69f8673226b23344a41861642361b428c6c3e1d43f01287b2dd7b6e001dfe23"} Mar 09 13:34:00 crc kubenswrapper[4764]: I0309 13:34:00.465546 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" event={"ID":"9ce85b1d-79b3-4669-a169-bfcd058c8931","Type":"ContainerStarted","Data":"0d63bb6c85375e48eceb743e37a5a244484ac0d049204cd33b8129a9bd67940d"} Mar 09 13:34:00 crc kubenswrapper[4764]: I0309 13:34:00.465562 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" event={"ID":"9ce85b1d-79b3-4669-a169-bfcd058c8931","Type":"ContainerStarted","Data":"cfdb7889f60234e9433e89a0e499d8835f26ab539859bcf0cf95231ed76a4806"} Mar 09 13:34:00 crc kubenswrapper[4764]: I0309 13:34:00.465582 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" event={"ID":"9ce85b1d-79b3-4669-a169-bfcd058c8931","Type":"ContainerStarted","Data":"5c89cfde8a656ee956f7f0ec25cad509f0fe6c8e2d7060684cadebb71cfa2ac0"} Mar 09 13:34:00 crc kubenswrapper[4764]: I0309 13:34:00.467378 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-zmzm7_202a1f58-ce83-4374-ac48-dc806f7b9d6b/kube-multus/2.log" Mar 09 13:34:00 crc kubenswrapper[4764]: I0309 13:34:00.467523 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551054-n54vp" Mar 09 13:34:00 crc kubenswrapper[4764]: E0309 13:34:00.518483 4764 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29551054-n54vp_openshift-infra_034371f5-4d6d-4a44-9678-9093ffaf3f9d_0(9371105bcadd648276b8c0628bf26340a2e88cbd213aba3b9c529f1027adc240): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:34:00 crc kubenswrapper[4764]: E0309 13:34:00.518570 4764 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29551054-n54vp_openshift-infra_034371f5-4d6d-4a44-9678-9093ffaf3f9d_0(9371105bcadd648276b8c0628bf26340a2e88cbd213aba3b9c529f1027adc240): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29551054-n54vp" Mar 09 13:34:00 crc kubenswrapper[4764]: E0309 13:34:00.518599 4764 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29551054-n54vp_openshift-infra_034371f5-4d6d-4a44-9678-9093ffaf3f9d_0(9371105bcadd648276b8c0628bf26340a2e88cbd213aba3b9c529f1027adc240): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-infra/auto-csr-approver-29551054-n54vp" Mar 09 13:34:00 crc kubenswrapper[4764]: E0309 13:34:00.518680 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29551054-n54vp_openshift-infra(034371f5-4d6d-4a44-9678-9093ffaf3f9d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29551054-n54vp_openshift-infra(034371f5-4d6d-4a44-9678-9093ffaf3f9d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29551054-n54vp_openshift-infra_034371f5-4d6d-4a44-9678-9093ffaf3f9d_0(9371105bcadd648276b8c0628bf26340a2e88cbd213aba3b9c529f1027adc240): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29551054-n54vp" podUID="034371f5-4d6d-4a44-9678-9093ffaf3f9d" Mar 09 13:34:02 crc kubenswrapper[4764]: I0309 13:34:02.485263 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" event={"ID":"9ce85b1d-79b3-4669-a169-bfcd058c8931","Type":"ContainerStarted","Data":"d6c45db45d998d410c60e72adb261563bb996c6a11ee9e2df9458db6ea6f17de"} Mar 09 13:34:05 crc kubenswrapper[4764]: I0309 13:34:05.391519 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551054-n54vp"] Mar 09 13:34:05 crc kubenswrapper[4764]: I0309 13:34:05.392254 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551054-n54vp" Mar 09 13:34:05 crc kubenswrapper[4764]: I0309 13:34:05.392707 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551054-n54vp" Mar 09 13:34:05 crc kubenswrapper[4764]: E0309 13:34:05.420729 4764 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29551054-n54vp_openshift-infra_034371f5-4d6d-4a44-9678-9093ffaf3f9d_0(490a791bba571dc39fa37f985dc6a1cdbff4000817835154f83fb9398601bb34): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:34:05 crc kubenswrapper[4764]: E0309 13:34:05.420831 4764 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29551054-n54vp_openshift-infra_034371f5-4d6d-4a44-9678-9093ffaf3f9d_0(490a791bba571dc39fa37f985dc6a1cdbff4000817835154f83fb9398601bb34): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29551054-n54vp" Mar 09 13:34:05 crc kubenswrapper[4764]: E0309 13:34:05.420858 4764 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29551054-n54vp_openshift-infra_034371f5-4d6d-4a44-9678-9093ffaf3f9d_0(490a791bba571dc39fa37f985dc6a1cdbff4000817835154f83fb9398601bb34): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-infra/auto-csr-approver-29551054-n54vp" Mar 09 13:34:05 crc kubenswrapper[4764]: E0309 13:34:05.420935 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29551054-n54vp_openshift-infra(034371f5-4d6d-4a44-9678-9093ffaf3f9d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29551054-n54vp_openshift-infra(034371f5-4d6d-4a44-9678-9093ffaf3f9d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29551054-n54vp_openshift-infra_034371f5-4d6d-4a44-9678-9093ffaf3f9d_0(490a791bba571dc39fa37f985dc6a1cdbff4000817835154f83fb9398601bb34): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29551054-n54vp" podUID="034371f5-4d6d-4a44-9678-9093ffaf3f9d" Mar 09 13:34:05 crc kubenswrapper[4764]: I0309 13:34:05.508253 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" event={"ID":"9ce85b1d-79b3-4669-a169-bfcd058c8931","Type":"ContainerStarted","Data":"0b696165846fc1f0bdec336ce0be47fd6364eb835170524ef7e828f6e1debe3b"} Mar 09 13:34:05 crc kubenswrapper[4764]: I0309 13:34:05.508872 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:34:05 crc kubenswrapper[4764]: I0309 13:34:05.508966 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:34:05 crc kubenswrapper[4764]: I0309 13:34:05.540369 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" podStartSLOduration=7.540341433 podStartE2EDuration="7.540341433s" podCreationTimestamp="2026-03-09 13:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-09 13:34:05.536562522 +0000 UTC m=+800.786734420" watchObservedRunningTime="2026-03-09 13:34:05.540341433 +0000 UTC m=+800.790513351" Mar 09 13:34:05 crc kubenswrapper[4764]: I0309 13:34:05.547164 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:34:06 crc kubenswrapper[4764]: I0309 13:34:06.514768 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:34:06 crc kubenswrapper[4764]: I0309 13:34:06.550699 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:34:14 crc kubenswrapper[4764]: I0309 13:34:14.559294 4764 scope.go:117] "RemoveContainer" containerID="63894bb3203376238f9013fa9abe0d5e50f67b5675ce93e7ce0b6cdd92b326b3" Mar 09 13:34:14 crc kubenswrapper[4764]: E0309 13:34:14.560154 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-zmzm7_openshift-multus(202a1f58-ce83-4374-ac48-dc806f7b9d6b)\"" pod="openshift-multus/multus-zmzm7" podUID="202a1f58-ce83-4374-ac48-dc806f7b9d6b" Mar 09 13:34:19 crc kubenswrapper[4764]: I0309 13:34:19.940915 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v"] Mar 09 13:34:19 crc kubenswrapper[4764]: I0309 13:34:19.942817 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:19 crc kubenswrapper[4764]: I0309 13:34:19.946038 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 09 13:34:19 crc kubenswrapper[4764]: I0309 13:34:19.953180 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v"] Mar 09 13:34:20 crc kubenswrapper[4764]: I0309 13:34:20.049331 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/704ddae7-42eb-4609-b4a3-64d5078c2126-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v\" (UID: \"704ddae7-42eb-4609-b4a3-64d5078c2126\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:20 crc kubenswrapper[4764]: I0309 13:34:20.049407 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/704ddae7-42eb-4609-b4a3-64d5078c2126-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v\" (UID: \"704ddae7-42eb-4609-b4a3-64d5078c2126\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:20 crc kubenswrapper[4764]: I0309 13:34:20.049433 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-775rf\" (UniqueName: \"kubernetes.io/projected/704ddae7-42eb-4609-b4a3-64d5078c2126-kube-api-access-775rf\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v\" (UID: \"704ddae7-42eb-4609-b4a3-64d5078c2126\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:20 crc kubenswrapper[4764]: 
I0309 13:34:20.150719 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/704ddae7-42eb-4609-b4a3-64d5078c2126-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v\" (UID: \"704ddae7-42eb-4609-b4a3-64d5078c2126\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:20 crc kubenswrapper[4764]: I0309 13:34:20.150766 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-775rf\" (UniqueName: \"kubernetes.io/projected/704ddae7-42eb-4609-b4a3-64d5078c2126-kube-api-access-775rf\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v\" (UID: \"704ddae7-42eb-4609-b4a3-64d5078c2126\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:20 crc kubenswrapper[4764]: I0309 13:34:20.150821 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/704ddae7-42eb-4609-b4a3-64d5078c2126-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v\" (UID: \"704ddae7-42eb-4609-b4a3-64d5078c2126\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:20 crc kubenswrapper[4764]: I0309 13:34:20.151243 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/704ddae7-42eb-4609-b4a3-64d5078c2126-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v\" (UID: \"704ddae7-42eb-4609-b4a3-64d5078c2126\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:20 crc kubenswrapper[4764]: I0309 13:34:20.151291 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/704ddae7-42eb-4609-b4a3-64d5078c2126-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v\" (UID: \"704ddae7-42eb-4609-b4a3-64d5078c2126\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:20 crc kubenswrapper[4764]: I0309 13:34:20.170470 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-775rf\" (UniqueName: \"kubernetes.io/projected/704ddae7-42eb-4609-b4a3-64d5078c2126-kube-api-access-775rf\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v\" (UID: \"704ddae7-42eb-4609-b4a3-64d5078c2126\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:20 crc kubenswrapper[4764]: I0309 13:34:20.259529 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:20 crc kubenswrapper[4764]: E0309 13:34:20.283297 4764 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_openshift-marketplace_704ddae7-42eb-4609-b4a3-64d5078c2126_0(6ea19c3eacc0d52e3b0d6826607088e7cd227b4b59271aefdeacb66597509e63): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:34:20 crc kubenswrapper[4764]: E0309 13:34:20.283360 4764 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_openshift-marketplace_704ddae7-42eb-4609-b4a3-64d5078c2126_0(6ea19c3eacc0d52e3b0d6826607088e7cd227b4b59271aefdeacb66597509e63): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:20 crc kubenswrapper[4764]: E0309 13:34:20.283384 4764 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_openshift-marketplace_704ddae7-42eb-4609-b4a3-64d5078c2126_0(6ea19c3eacc0d52e3b0d6826607088e7cd227b4b59271aefdeacb66597509e63): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:20 crc kubenswrapper[4764]: E0309 13:34:20.283437 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_openshift-marketplace(704ddae7-42eb-4609-b4a3-64d5078c2126)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_openshift-marketplace(704ddae7-42eb-4609-b4a3-64d5078c2126)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_openshift-marketplace_704ddae7-42eb-4609-b4a3-64d5078c2126_0(6ea19c3eacc0d52e3b0d6826607088e7cd227b4b59271aefdeacb66597509e63): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" podUID="704ddae7-42eb-4609-b4a3-64d5078c2126" Mar 09 13:34:20 crc kubenswrapper[4764]: I0309 13:34:20.559058 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551054-n54vp" Mar 09 13:34:20 crc kubenswrapper[4764]: I0309 13:34:20.560111 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551054-n54vp" Mar 09 13:34:20 crc kubenswrapper[4764]: E0309 13:34:20.582450 4764 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29551054-n54vp_openshift-infra_034371f5-4d6d-4a44-9678-9093ffaf3f9d_0(aed54c1a64f25b2e95214eed47e79e32b233372ce821e9e07390830d4703c06c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:34:20 crc kubenswrapper[4764]: E0309 13:34:20.582545 4764 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29551054-n54vp_openshift-infra_034371f5-4d6d-4a44-9678-9093ffaf3f9d_0(aed54c1a64f25b2e95214eed47e79e32b233372ce821e9e07390830d4703c06c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29551054-n54vp" Mar 09 13:34:20 crc kubenswrapper[4764]: E0309 13:34:20.582578 4764 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29551054-n54vp_openshift-infra_034371f5-4d6d-4a44-9678-9093ffaf3f9d_0(aed54c1a64f25b2e95214eed47e79e32b233372ce821e9e07390830d4703c06c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-infra/auto-csr-approver-29551054-n54vp" Mar 09 13:34:20 crc kubenswrapper[4764]: E0309 13:34:20.582662 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29551054-n54vp_openshift-infra(034371f5-4d6d-4a44-9678-9093ffaf3f9d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29551054-n54vp_openshift-infra(034371f5-4d6d-4a44-9678-9093ffaf3f9d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29551054-n54vp_openshift-infra_034371f5-4d6d-4a44-9678-9093ffaf3f9d_0(aed54c1a64f25b2e95214eed47e79e32b233372ce821e9e07390830d4703c06c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29551054-n54vp" podUID="034371f5-4d6d-4a44-9678-9093ffaf3f9d" Mar 09 13:34:20 crc kubenswrapper[4764]: I0309 13:34:20.608821 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:20 crc kubenswrapper[4764]: I0309 13:34:20.609398 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:20 crc kubenswrapper[4764]: E0309 13:34:20.630565 4764 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_openshift-marketplace_704ddae7-42eb-4609-b4a3-64d5078c2126_0(9cc09cde10f092620ca58e98e418d3793b0b63299d57f13e1dba229626cd06c2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 13:34:20 crc kubenswrapper[4764]: E0309 13:34:20.630634 4764 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_openshift-marketplace_704ddae7-42eb-4609-b4a3-64d5078c2126_0(9cc09cde10f092620ca58e98e418d3793b0b63299d57f13e1dba229626cd06c2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:20 crc kubenswrapper[4764]: E0309 13:34:20.630677 4764 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_openshift-marketplace_704ddae7-42eb-4609-b4a3-64d5078c2126_0(9cc09cde10f092620ca58e98e418d3793b0b63299d57f13e1dba229626cd06c2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:20 crc kubenswrapper[4764]: E0309 13:34:20.630727 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_openshift-marketplace(704ddae7-42eb-4609-b4a3-64d5078c2126)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_openshift-marketplace(704ddae7-42eb-4609-b4a3-64d5078c2126)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_openshift-marketplace_704ddae7-42eb-4609-b4a3-64d5078c2126_0(9cc09cde10f092620ca58e98e418d3793b0b63299d57f13e1dba229626cd06c2): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" podUID="704ddae7-42eb-4609-b4a3-64d5078c2126" Mar 09 13:34:28 crc kubenswrapper[4764]: I0309 13:34:28.370512 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:34:28 crc kubenswrapper[4764]: I0309 13:34:28.371954 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:34:28 crc kubenswrapper[4764]: I0309 13:34:28.559301 4764 scope.go:117] "RemoveContainer" containerID="63894bb3203376238f9013fa9abe0d5e50f67b5675ce93e7ce0b6cdd92b326b3" Mar 09 13:34:29 crc kubenswrapper[4764]: I0309 13:34:29.248178 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:34:29 crc kubenswrapper[4764]: I0309 13:34:29.676968 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zmzm7_202a1f58-ce83-4374-ac48-dc806f7b9d6b/kube-multus/2.log" Mar 09 13:34:29 crc kubenswrapper[4764]: I0309 13:34:29.677976 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zmzm7" event={"ID":"202a1f58-ce83-4374-ac48-dc806f7b9d6b","Type":"ContainerStarted","Data":"77ff1aa6c5c0eeb845444895ddf031c60c4d096f2fb0c65b0a27e0b43cd150c0"} Mar 09 13:34:34 crc kubenswrapper[4764]: I0309 13:34:34.559768 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551054-n54vp" Mar 09 13:34:34 crc kubenswrapper[4764]: I0309 13:34:34.559823 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:34 crc kubenswrapper[4764]: I0309 13:34:34.560498 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551054-n54vp" Mar 09 13:34:34 crc kubenswrapper[4764]: I0309 13:34:34.560772 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:34 crc kubenswrapper[4764]: I0309 13:34:34.775201 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551054-n54vp"] Mar 09 13:34:34 crc kubenswrapper[4764]: I0309 13:34:34.822375 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v"] Mar 09 13:34:34 crc kubenswrapper[4764]: W0309 13:34:34.827434 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod704ddae7_42eb_4609_b4a3_64d5078c2126.slice/crio-1aae20624cf4f8dfb3e9bc0fa86cea95ff0b6ca3747b8493fe018a0d37430bcf WatchSource:0}: Error finding container 1aae20624cf4f8dfb3e9bc0fa86cea95ff0b6ca3747b8493fe018a0d37430bcf: Status 404 returned error can't find the container with id 1aae20624cf4f8dfb3e9bc0fa86cea95ff0b6ca3747b8493fe018a0d37430bcf Mar 09 13:34:34 crc kubenswrapper[4764]: I0309 13:34:34.855080 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" 
event={"ID":"704ddae7-42eb-4609-b4a3-64d5078c2126","Type":"ContainerStarted","Data":"1aae20624cf4f8dfb3e9bc0fa86cea95ff0b6ca3747b8493fe018a0d37430bcf"} Mar 09 13:34:34 crc kubenswrapper[4764]: I0309 13:34:34.856539 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551054-n54vp" event={"ID":"034371f5-4d6d-4a44-9678-9093ffaf3f9d","Type":"ContainerStarted","Data":"d80da71a220cdeff4fcb886a07bef2e0df9a89242a263d781a6886d65252f40a"} Mar 09 13:34:35 crc kubenswrapper[4764]: I0309 13:34:35.866771 4764 generic.go:334] "Generic (PLEG): container finished" podID="704ddae7-42eb-4609-b4a3-64d5078c2126" containerID="7b5418c090975c0574a7d218bf987e53c3ec2e85e4ff9a68edfbfb8766c1af5a" exitCode=0 Mar 09 13:34:35 crc kubenswrapper[4764]: I0309 13:34:35.866845 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" event={"ID":"704ddae7-42eb-4609-b4a3-64d5078c2126","Type":"ContainerDied","Data":"7b5418c090975c0574a7d218bf987e53c3ec2e85e4ff9a68edfbfb8766c1af5a"} Mar 09 13:34:36 crc kubenswrapper[4764]: I0309 13:34:36.878162 4764 generic.go:334] "Generic (PLEG): container finished" podID="034371f5-4d6d-4a44-9678-9093ffaf3f9d" containerID="33a445a6adcd631bbd08283597c203355f22e04e83cf13fd98ba9c1f71c00bd6" exitCode=0 Mar 09 13:34:36 crc kubenswrapper[4764]: I0309 13:34:36.878302 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551054-n54vp" event={"ID":"034371f5-4d6d-4a44-9678-9093ffaf3f9d","Type":"ContainerDied","Data":"33a445a6adcd631bbd08283597c203355f22e04e83cf13fd98ba9c1f71c00bd6"} Mar 09 13:34:38 crc kubenswrapper[4764]: I0309 13:34:38.116550 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551054-n54vp" Mar 09 13:34:38 crc kubenswrapper[4764]: I0309 13:34:38.260331 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5wg8\" (UniqueName: \"kubernetes.io/projected/034371f5-4d6d-4a44-9678-9093ffaf3f9d-kube-api-access-c5wg8\") pod \"034371f5-4d6d-4a44-9678-9093ffaf3f9d\" (UID: \"034371f5-4d6d-4a44-9678-9093ffaf3f9d\") " Mar 09 13:34:38 crc kubenswrapper[4764]: I0309 13:34:38.268099 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/034371f5-4d6d-4a44-9678-9093ffaf3f9d-kube-api-access-c5wg8" (OuterVolumeSpecName: "kube-api-access-c5wg8") pod "034371f5-4d6d-4a44-9678-9093ffaf3f9d" (UID: "034371f5-4d6d-4a44-9678-9093ffaf3f9d"). InnerVolumeSpecName "kube-api-access-c5wg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:34:38 crc kubenswrapper[4764]: I0309 13:34:38.363691 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5wg8\" (UniqueName: \"kubernetes.io/projected/034371f5-4d6d-4a44-9678-9093ffaf3f9d-kube-api-access-c5wg8\") on node \"crc\" DevicePath \"\"" Mar 09 13:34:38 crc kubenswrapper[4764]: I0309 13:34:38.895346 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551054-n54vp" event={"ID":"034371f5-4d6d-4a44-9678-9093ffaf3f9d","Type":"ContainerDied","Data":"d80da71a220cdeff4fcb886a07bef2e0df9a89242a263d781a6886d65252f40a"} Mar 09 13:34:38 crc kubenswrapper[4764]: I0309 13:34:38.895394 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d80da71a220cdeff4fcb886a07bef2e0df9a89242a263d781a6886d65252f40a" Mar 09 13:34:38 crc kubenswrapper[4764]: I0309 13:34:38.895423 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551054-n54vp" Mar 09 13:34:39 crc kubenswrapper[4764]: I0309 13:34:39.181078 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551048-fgf8g"] Mar 09 13:34:39 crc kubenswrapper[4764]: I0309 13:34:39.184134 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551048-fgf8g"] Mar 09 13:34:39 crc kubenswrapper[4764]: I0309 13:34:39.577580 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f" path="/var/lib/kubelet/pods/0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f/volumes" Mar 09 13:34:41 crc kubenswrapper[4764]: I0309 13:34:41.920482 4764 generic.go:334] "Generic (PLEG): container finished" podID="704ddae7-42eb-4609-b4a3-64d5078c2126" containerID="4d4fb12cd75aa4f4f3ec85ed29ddf07e07feed48b0270424b85035c7f01f3e24" exitCode=0 Mar 09 13:34:41 crc kubenswrapper[4764]: I0309 13:34:41.921083 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" event={"ID":"704ddae7-42eb-4609-b4a3-64d5078c2126","Type":"ContainerDied","Data":"4d4fb12cd75aa4f4f3ec85ed29ddf07e07feed48b0270424b85035c7f01f3e24"} Mar 09 13:34:42 crc kubenswrapper[4764]: I0309 13:34:42.931296 4764 generic.go:334] "Generic (PLEG): container finished" podID="704ddae7-42eb-4609-b4a3-64d5078c2126" containerID="fa6fe455ac1be47605b80c072783a99991320268046d70c8097db7878a15cacb" exitCode=0 Mar 09 13:34:42 crc kubenswrapper[4764]: I0309 13:34:42.931352 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" event={"ID":"704ddae7-42eb-4609-b4a3-64d5078c2126","Type":"ContainerDied","Data":"fa6fe455ac1be47605b80c072783a99991320268046d70c8097db7878a15cacb"} Mar 09 13:34:44 crc kubenswrapper[4764]: I0309 13:34:44.239245 4764 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:44 crc kubenswrapper[4764]: I0309 13:34:44.264322 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-775rf\" (UniqueName: \"kubernetes.io/projected/704ddae7-42eb-4609-b4a3-64d5078c2126-kube-api-access-775rf\") pod \"704ddae7-42eb-4609-b4a3-64d5078c2126\" (UID: \"704ddae7-42eb-4609-b4a3-64d5078c2126\") " Mar 09 13:34:44 crc kubenswrapper[4764]: I0309 13:34:44.264384 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/704ddae7-42eb-4609-b4a3-64d5078c2126-bundle\") pod \"704ddae7-42eb-4609-b4a3-64d5078c2126\" (UID: \"704ddae7-42eb-4609-b4a3-64d5078c2126\") " Mar 09 13:34:44 crc kubenswrapper[4764]: I0309 13:34:44.264564 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/704ddae7-42eb-4609-b4a3-64d5078c2126-util\") pod \"704ddae7-42eb-4609-b4a3-64d5078c2126\" (UID: \"704ddae7-42eb-4609-b4a3-64d5078c2126\") " Mar 09 13:34:44 crc kubenswrapper[4764]: I0309 13:34:44.265385 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/704ddae7-42eb-4609-b4a3-64d5078c2126-bundle" (OuterVolumeSpecName: "bundle") pod "704ddae7-42eb-4609-b4a3-64d5078c2126" (UID: "704ddae7-42eb-4609-b4a3-64d5078c2126"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:34:44 crc kubenswrapper[4764]: I0309 13:34:44.272873 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/704ddae7-42eb-4609-b4a3-64d5078c2126-kube-api-access-775rf" (OuterVolumeSpecName: "kube-api-access-775rf") pod "704ddae7-42eb-4609-b4a3-64d5078c2126" (UID: "704ddae7-42eb-4609-b4a3-64d5078c2126"). 
InnerVolumeSpecName "kube-api-access-775rf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:34:44 crc kubenswrapper[4764]: I0309 13:34:44.276106 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/704ddae7-42eb-4609-b4a3-64d5078c2126-util" (OuterVolumeSpecName: "util") pod "704ddae7-42eb-4609-b4a3-64d5078c2126" (UID: "704ddae7-42eb-4609-b4a3-64d5078c2126"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:34:44 crc kubenswrapper[4764]: I0309 13:34:44.366634 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-775rf\" (UniqueName: \"kubernetes.io/projected/704ddae7-42eb-4609-b4a3-64d5078c2126-kube-api-access-775rf\") on node \"crc\" DevicePath \"\"" Mar 09 13:34:44 crc kubenswrapper[4764]: I0309 13:34:44.367121 4764 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/704ddae7-42eb-4609-b4a3-64d5078c2126-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:34:44 crc kubenswrapper[4764]: I0309 13:34:44.367137 4764 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/704ddae7-42eb-4609-b4a3-64d5078c2126-util\") on node \"crc\" DevicePath \"\"" Mar 09 13:34:44 crc kubenswrapper[4764]: I0309 13:34:44.950598 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" event={"ID":"704ddae7-42eb-4609-b4a3-64d5078c2126","Type":"ContainerDied","Data":"1aae20624cf4f8dfb3e9bc0fa86cea95ff0b6ca3747b8493fe018a0d37430bcf"} Mar 09 13:34:44 crc kubenswrapper[4764]: I0309 13:34:44.950684 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:44 crc kubenswrapper[4764]: I0309 13:34:44.950707 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1aae20624cf4f8dfb3e9bc0fa86cea95ff0b6ca3747b8493fe018a0d37430bcf" Mar 09 13:34:49 crc kubenswrapper[4764]: I0309 13:34:49.503538 4764 scope.go:117] "RemoveContainer" containerID="f44561c47745677a2b6e923d3f449a7c01740fc8b6e465eeb90cec0a7d1ebe67" Mar 09 13:34:51 crc kubenswrapper[4764]: I0309 13:34:51.200288 4764 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 09 13:34:51 crc kubenswrapper[4764]: I0309 13:34:51.726608 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-q5kvf"] Mar 09 13:34:51 crc kubenswrapper[4764]: E0309 13:34:51.726974 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="704ddae7-42eb-4609-b4a3-64d5078c2126" containerName="util" Mar 09 13:34:51 crc kubenswrapper[4764]: I0309 13:34:51.727000 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="704ddae7-42eb-4609-b4a3-64d5078c2126" containerName="util" Mar 09 13:34:51 crc kubenswrapper[4764]: E0309 13:34:51.727032 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="704ddae7-42eb-4609-b4a3-64d5078c2126" containerName="extract" Mar 09 13:34:51 crc kubenswrapper[4764]: I0309 13:34:51.727040 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="704ddae7-42eb-4609-b4a3-64d5078c2126" containerName="extract" Mar 09 13:34:51 crc kubenswrapper[4764]: E0309 13:34:51.727055 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="034371f5-4d6d-4a44-9678-9093ffaf3f9d" containerName="oc" Mar 09 13:34:51 crc kubenswrapper[4764]: I0309 13:34:51.727063 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="034371f5-4d6d-4a44-9678-9093ffaf3f9d" 
containerName="oc" Mar 09 13:34:51 crc kubenswrapper[4764]: E0309 13:34:51.727078 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="704ddae7-42eb-4609-b4a3-64d5078c2126" containerName="pull" Mar 09 13:34:51 crc kubenswrapper[4764]: I0309 13:34:51.727089 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="704ddae7-42eb-4609-b4a3-64d5078c2126" containerName="pull" Mar 09 13:34:51 crc kubenswrapper[4764]: I0309 13:34:51.727224 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="034371f5-4d6d-4a44-9678-9093ffaf3f9d" containerName="oc" Mar 09 13:34:51 crc kubenswrapper[4764]: I0309 13:34:51.727236 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="704ddae7-42eb-4609-b4a3-64d5078c2126" containerName="extract" Mar 09 13:34:51 crc kubenswrapper[4764]: I0309 13:34:51.727826 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-q5kvf" Mar 09 13:34:51 crc kubenswrapper[4764]: I0309 13:34:51.729797 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-hmssh" Mar 09 13:34:51 crc kubenswrapper[4764]: I0309 13:34:51.731114 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 09 13:34:51 crc kubenswrapper[4764]: I0309 13:34:51.731338 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 09 13:34:51 crc kubenswrapper[4764]: I0309 13:34:51.737928 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-q5kvf"] Mar 09 13:34:51 crc kubenswrapper[4764]: I0309 13:34:51.783172 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhmc4\" (UniqueName: \"kubernetes.io/projected/33e9b814-6368-46c6-aae2-5a3df1839d29-kube-api-access-nhmc4\") pod 
\"nmstate-operator-75c5dccd6c-q5kvf\" (UID: \"33e9b814-6368-46c6-aae2-5a3df1839d29\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-q5kvf" Mar 09 13:34:51 crc kubenswrapper[4764]: I0309 13:34:51.885340 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhmc4\" (UniqueName: \"kubernetes.io/projected/33e9b814-6368-46c6-aae2-5a3df1839d29-kube-api-access-nhmc4\") pod \"nmstate-operator-75c5dccd6c-q5kvf\" (UID: \"33e9b814-6368-46c6-aae2-5a3df1839d29\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-q5kvf" Mar 09 13:34:51 crc kubenswrapper[4764]: I0309 13:34:51.906563 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhmc4\" (UniqueName: \"kubernetes.io/projected/33e9b814-6368-46c6-aae2-5a3df1839d29-kube-api-access-nhmc4\") pod \"nmstate-operator-75c5dccd6c-q5kvf\" (UID: \"33e9b814-6368-46c6-aae2-5a3df1839d29\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-q5kvf" Mar 09 13:34:52 crc kubenswrapper[4764]: I0309 13:34:52.046487 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-q5kvf" Mar 09 13:34:52 crc kubenswrapper[4764]: I0309 13:34:52.275034 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-q5kvf"] Mar 09 13:34:53 crc kubenswrapper[4764]: I0309 13:34:53.008207 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-q5kvf" event={"ID":"33e9b814-6368-46c6-aae2-5a3df1839d29","Type":"ContainerStarted","Data":"a9e4a4aa7dba2e5c4d2a5d2a874c3b217272816a0a21b27f82cdc46a0c24caa7"} Mar 09 13:34:55 crc kubenswrapper[4764]: I0309 13:34:55.026667 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-q5kvf" event={"ID":"33e9b814-6368-46c6-aae2-5a3df1839d29","Type":"ContainerStarted","Data":"ba9af486bc1be26b8631581b44cec1c0e3ba996aec7be5a13baa05e7eb699625"} Mar 09 13:34:55 crc kubenswrapper[4764]: I0309 13:34:55.051097 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-q5kvf" podStartSLOduration=1.788189456 podStartE2EDuration="4.05107395s" podCreationTimestamp="2026-03-09 13:34:51 +0000 UTC" firstStartedPulling="2026-03-09 13:34:52.293604296 +0000 UTC m=+847.543776204" lastFinishedPulling="2026-03-09 13:34:54.55648879 +0000 UTC m=+849.806660698" observedRunningTime="2026-03-09 13:34:55.044157164 +0000 UTC m=+850.294329092" watchObservedRunningTime="2026-03-09 13:34:55.05107395 +0000 UTC m=+850.301245858" Mar 09 13:34:58 crc kubenswrapper[4764]: I0309 13:34:58.370791 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:34:58 crc kubenswrapper[4764]: I0309 13:34:58.371966 4764 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:34:58 crc kubenswrapper[4764]: I0309 13:34:58.372062 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:34:58 crc kubenswrapper[4764]: I0309 13:34:58.372956 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3e004d46e664a823c675c177e537cdd2b21dcfc6ff9afcb1b390da1939340817"} pod="openshift-machine-config-operator/machine-config-daemon-xxczl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 13:34:58 crc kubenswrapper[4764]: I0309 13:34:58.373022 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" containerID="cri-o://3e004d46e664a823c675c177e537cdd2b21dcfc6ff9afcb1b390da1939340817" gracePeriod=600 Mar 09 13:34:59 crc kubenswrapper[4764]: I0309 13:34:59.053140 4764 generic.go:334] "Generic (PLEG): container finished" podID="6bcdd179-43c2-427c-9fac-7155c122e922" containerID="3e004d46e664a823c675c177e537cdd2b21dcfc6ff9afcb1b390da1939340817" exitCode=0 Mar 09 13:34:59 crc kubenswrapper[4764]: I0309 13:34:59.053609 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerDied","Data":"3e004d46e664a823c675c177e537cdd2b21dcfc6ff9afcb1b390da1939340817"} Mar 09 13:34:59 crc kubenswrapper[4764]: I0309 13:34:59.053638 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"bd218b04a62035d950083134e237631295493daecba2baa1eef096560f891819"} Mar 09 13:34:59 crc kubenswrapper[4764]: I0309 13:34:59.053678 4764 scope.go:117] "RemoveContainer" containerID="6be6253041773ccdd98bbdfbbb5e4447fbae62fa99ca2320328889b4671f6c14" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.130400 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-jschs"] Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.132250 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-jschs" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.133849 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-wbdq9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.135494 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-wv755"] Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.136496 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-wv755" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.138104 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.150480 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-wv755"] Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.167800 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-sl5hn"] Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.168807 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-sl5hn" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.191272 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-jschs"] Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.214413 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6dc5759c-db8c-4025-bc16-a07e4dc6278a-dbus-socket\") pod \"nmstate-handler-sl5hn\" (UID: \"6dc5759c-db8c-4025-bc16-a07e4dc6278a\") " pod="openshift-nmstate/nmstate-handler-sl5hn" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.214466 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6dc5759c-db8c-4025-bc16-a07e4dc6278a-nmstate-lock\") pod \"nmstate-handler-sl5hn\" (UID: \"6dc5759c-db8c-4025-bc16-a07e4dc6278a\") " pod="openshift-nmstate/nmstate-handler-sl5hn" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.214511 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6t2l\" (UniqueName: \"kubernetes.io/projected/f339e495-f347-45b8-b9da-2cd832ac4300-kube-api-access-x6t2l\") pod \"nmstate-webhook-786f45cff4-wv755\" (UID: \"f339e495-f347-45b8-b9da-2cd832ac4300\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-wv755" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.214560 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6dc5759c-db8c-4025-bc16-a07e4dc6278a-ovs-socket\") pod \"nmstate-handler-sl5hn\" (UID: \"6dc5759c-db8c-4025-bc16-a07e4dc6278a\") " pod="openshift-nmstate/nmstate-handler-sl5hn" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.214579 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f339e495-f347-45b8-b9da-2cd832ac4300-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-wv755\" (UID: \"f339e495-f347-45b8-b9da-2cd832ac4300\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-wv755" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.214600 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf2p2\" (UniqueName: \"kubernetes.io/projected/6dc5759c-db8c-4025-bc16-a07e4dc6278a-kube-api-access-tf2p2\") pod \"nmstate-handler-sl5hn\" (UID: \"6dc5759c-db8c-4025-bc16-a07e4dc6278a\") " pod="openshift-nmstate/nmstate-handler-sl5hn" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.214624 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8lfj\" (UniqueName: \"kubernetes.io/projected/abca721f-d47f-4e38-ab9e-0832de2c70e6-kube-api-access-n8lfj\") pod \"nmstate-metrics-69594cc75-jschs\" (UID: \"abca721f-d47f-4e38-ab9e-0832de2c70e6\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-jschs" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.271857 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4"] Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.272614 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.275562 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.275810 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.275929 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-94p9n" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.291437 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4"] Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.315884 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf2p2\" (UniqueName: \"kubernetes.io/projected/6dc5759c-db8c-4025-bc16-a07e4dc6278a-kube-api-access-tf2p2\") pod \"nmstate-handler-sl5hn\" (UID: \"6dc5759c-db8c-4025-bc16-a07e4dc6278a\") " pod="openshift-nmstate/nmstate-handler-sl5hn" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.315950 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8lfj\" (UniqueName: \"kubernetes.io/projected/abca721f-d47f-4e38-ab9e-0832de2c70e6-kube-api-access-n8lfj\") pod \"nmstate-metrics-69594cc75-jschs\" (UID: \"abca721f-d47f-4e38-ab9e-0832de2c70e6\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-jschs" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.315978 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6rmd\" (UniqueName: \"kubernetes.io/projected/fc521772-06d5-47ec-85d0-6162bb98af30-kube-api-access-r6rmd\") pod \"nmstate-console-plugin-5dcbbd79cf-qpjz4\" (UID: \"fc521772-06d5-47ec-85d0-6162bb98af30\") " 
pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.316014 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc521772-06d5-47ec-85d0-6162bb98af30-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-qpjz4\" (UID: \"fc521772-06d5-47ec-85d0-6162bb98af30\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.316044 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fc521772-06d5-47ec-85d0-6162bb98af30-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-qpjz4\" (UID: \"fc521772-06d5-47ec-85d0-6162bb98af30\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.316560 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6dc5759c-db8c-4025-bc16-a07e4dc6278a-dbus-socket\") pod \"nmstate-handler-sl5hn\" (UID: \"6dc5759c-db8c-4025-bc16-a07e4dc6278a\") " pod="openshift-nmstate/nmstate-handler-sl5hn" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.316591 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6dc5759c-db8c-4025-bc16-a07e4dc6278a-nmstate-lock\") pod \"nmstate-handler-sl5hn\" (UID: \"6dc5759c-db8c-4025-bc16-a07e4dc6278a\") " pod="openshift-nmstate/nmstate-handler-sl5hn" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.317036 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6t2l\" (UniqueName: \"kubernetes.io/projected/f339e495-f347-45b8-b9da-2cd832ac4300-kube-api-access-x6t2l\") pod 
\"nmstate-webhook-786f45cff4-wv755\" (UID: \"f339e495-f347-45b8-b9da-2cd832ac4300\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-wv755" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.316776 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6dc5759c-db8c-4025-bc16-a07e4dc6278a-nmstate-lock\") pod \"nmstate-handler-sl5hn\" (UID: \"6dc5759c-db8c-4025-bc16-a07e4dc6278a\") " pod="openshift-nmstate/nmstate-handler-sl5hn" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.316977 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6dc5759c-db8c-4025-bc16-a07e4dc6278a-dbus-socket\") pod \"nmstate-handler-sl5hn\" (UID: \"6dc5759c-db8c-4025-bc16-a07e4dc6278a\") " pod="openshift-nmstate/nmstate-handler-sl5hn" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.317382 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6dc5759c-db8c-4025-bc16-a07e4dc6278a-ovs-socket\") pod \"nmstate-handler-sl5hn\" (UID: \"6dc5759c-db8c-4025-bc16-a07e4dc6278a\") " pod="openshift-nmstate/nmstate-handler-sl5hn" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.317448 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6dc5759c-db8c-4025-bc16-a07e4dc6278a-ovs-socket\") pod \"nmstate-handler-sl5hn\" (UID: \"6dc5759c-db8c-4025-bc16-a07e4dc6278a\") " pod="openshift-nmstate/nmstate-handler-sl5hn" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.317415 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f339e495-f347-45b8-b9da-2cd832ac4300-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-wv755\" (UID: \"f339e495-f347-45b8-b9da-2cd832ac4300\") " 
pod="openshift-nmstate/nmstate-webhook-786f45cff4-wv755" Mar 09 13:35:00 crc kubenswrapper[4764]: E0309 13:35:00.317590 4764 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 09 13:35:00 crc kubenswrapper[4764]: E0309 13:35:00.317944 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f339e495-f347-45b8-b9da-2cd832ac4300-tls-key-pair podName:f339e495-f347-45b8-b9da-2cd832ac4300 nodeName:}" failed. No retries permitted until 2026-03-09 13:35:00.817635187 +0000 UTC m=+856.067807095 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/f339e495-f347-45b8-b9da-2cd832ac4300-tls-key-pair") pod "nmstate-webhook-786f45cff4-wv755" (UID: "f339e495-f347-45b8-b9da-2cd832ac4300") : secret "openshift-nmstate-webhook" not found Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.338310 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf2p2\" (UniqueName: \"kubernetes.io/projected/6dc5759c-db8c-4025-bc16-a07e4dc6278a-kube-api-access-tf2p2\") pod \"nmstate-handler-sl5hn\" (UID: \"6dc5759c-db8c-4025-bc16-a07e4dc6278a\") " pod="openshift-nmstate/nmstate-handler-sl5hn" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.339032 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6t2l\" (UniqueName: \"kubernetes.io/projected/f339e495-f347-45b8-b9da-2cd832ac4300-kube-api-access-x6t2l\") pod \"nmstate-webhook-786f45cff4-wv755\" (UID: \"f339e495-f347-45b8-b9da-2cd832ac4300\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-wv755" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.339716 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8lfj\" (UniqueName: \"kubernetes.io/projected/abca721f-d47f-4e38-ab9e-0832de2c70e6-kube-api-access-n8lfj\") pod 
\"nmstate-metrics-69594cc75-jschs\" (UID: \"abca721f-d47f-4e38-ab9e-0832de2c70e6\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-jschs" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.419865 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6rmd\" (UniqueName: \"kubernetes.io/projected/fc521772-06d5-47ec-85d0-6162bb98af30-kube-api-access-r6rmd\") pod \"nmstate-console-plugin-5dcbbd79cf-qpjz4\" (UID: \"fc521772-06d5-47ec-85d0-6162bb98af30\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.420303 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc521772-06d5-47ec-85d0-6162bb98af30-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-qpjz4\" (UID: \"fc521772-06d5-47ec-85d0-6162bb98af30\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.420339 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fc521772-06d5-47ec-85d0-6162bb98af30-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-qpjz4\" (UID: \"fc521772-06d5-47ec-85d0-6162bb98af30\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4" Mar 09 13:35:00 crc kubenswrapper[4764]: E0309 13:35:00.420455 4764 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Mar 09 13:35:00 crc kubenswrapper[4764]: E0309 13:35:00.420522 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc521772-06d5-47ec-85d0-6162bb98af30-plugin-serving-cert podName:fc521772-06d5-47ec-85d0-6162bb98af30 nodeName:}" failed. No retries permitted until 2026-03-09 13:35:00.920498141 +0000 UTC m=+856.170670049 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/fc521772-06d5-47ec-85d0-6162bb98af30-plugin-serving-cert") pod "nmstate-console-plugin-5dcbbd79cf-qpjz4" (UID: "fc521772-06d5-47ec-85d0-6162bb98af30") : secret "plugin-serving-cert" not found Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.421577 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fc521772-06d5-47ec-85d0-6162bb98af30-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-qpjz4\" (UID: \"fc521772-06d5-47ec-85d0-6162bb98af30\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.441587 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6rmd\" (UniqueName: \"kubernetes.io/projected/fc521772-06d5-47ec-85d0-6162bb98af30-kube-api-access-r6rmd\") pod \"nmstate-console-plugin-5dcbbd79cf-qpjz4\" (UID: \"fc521772-06d5-47ec-85d0-6162bb98af30\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.479239 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5d959df5d8-nhrp9"] Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.480398 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.504566 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d959df5d8-nhrp9"] Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.515212 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-jschs" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.522340 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ff3abc33-b00a-400d-b1fb-c22dc5faf810-console-config\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.522420 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9t7k\" (UniqueName: \"kubernetes.io/projected/ff3abc33-b00a-400d-b1fb-c22dc5faf810-kube-api-access-w9t7k\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.522509 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff3abc33-b00a-400d-b1fb-c22dc5faf810-console-serving-cert\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.522581 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff3abc33-b00a-400d-b1fb-c22dc5faf810-trusted-ca-bundle\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.522688 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/ff3abc33-b00a-400d-b1fb-c22dc5faf810-oauth-serving-cert\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.522737 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff3abc33-b00a-400d-b1fb-c22dc5faf810-service-ca\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.522790 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ff3abc33-b00a-400d-b1fb-c22dc5faf810-console-oauth-config\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.534410 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-sl5hn" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.623715 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ff3abc33-b00a-400d-b1fb-c22dc5faf810-console-config\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.623763 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9t7k\" (UniqueName: \"kubernetes.io/projected/ff3abc33-b00a-400d-b1fb-c22dc5faf810-kube-api-access-w9t7k\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.623811 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff3abc33-b00a-400d-b1fb-c22dc5faf810-console-serving-cert\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.623886 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff3abc33-b00a-400d-b1fb-c22dc5faf810-trusted-ca-bundle\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.623907 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ff3abc33-b00a-400d-b1fb-c22dc5faf810-oauth-serving-cert\") pod \"console-5d959df5d8-nhrp9\" (UID: 
\"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.623935 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff3abc33-b00a-400d-b1fb-c22dc5faf810-service-ca\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.623965 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ff3abc33-b00a-400d-b1fb-c22dc5faf810-console-oauth-config\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.625350 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ff3abc33-b00a-400d-b1fb-c22dc5faf810-console-config\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.630072 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff3abc33-b00a-400d-b1fb-c22dc5faf810-console-serving-cert\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.631465 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff3abc33-b00a-400d-b1fb-c22dc5faf810-service-ca\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " 
pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.631522 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ff3abc33-b00a-400d-b1fb-c22dc5faf810-oauth-serving-cert\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.631475 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff3abc33-b00a-400d-b1fb-c22dc5faf810-trusted-ca-bundle\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.631618 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ff3abc33-b00a-400d-b1fb-c22dc5faf810-console-oauth-config\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.647981 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9t7k\" (UniqueName: \"kubernetes.io/projected/ff3abc33-b00a-400d-b1fb-c22dc5faf810-kube-api-access-w9t7k\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.801210 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.828187 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f339e495-f347-45b8-b9da-2cd832ac4300-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-wv755\" (UID: \"f339e495-f347-45b8-b9da-2cd832ac4300\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-wv755" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.840081 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-jschs"] Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.840957 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f339e495-f347-45b8-b9da-2cd832ac4300-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-wv755\" (UID: \"f339e495-f347-45b8-b9da-2cd832ac4300\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-wv755" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.859463 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.929581 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc521772-06d5-47ec-85d0-6162bb98af30-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-qpjz4\" (UID: \"fc521772-06d5-47ec-85d0-6162bb98af30\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.933831 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc521772-06d5-47ec-85d0-6162bb98af30-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-qpjz4\" (UID: \"fc521772-06d5-47ec-85d0-6162bb98af30\") " 
pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4" Mar 09 13:35:01 crc kubenswrapper[4764]: I0309 13:35:01.005227 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d959df5d8-nhrp9"] Mar 09 13:35:01 crc kubenswrapper[4764]: W0309 13:35:01.013521 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff3abc33_b00a_400d_b1fb_c22dc5faf810.slice/crio-3de88e0490d67bb2012521e7663e9aaa9bc48e47450b3d17f45e0af89f24ff97 WatchSource:0}: Error finding container 3de88e0490d67bb2012521e7663e9aaa9bc48e47450b3d17f45e0af89f24ff97: Status 404 returned error can't find the container with id 3de88e0490d67bb2012521e7663e9aaa9bc48e47450b3d17f45e0af89f24ff97 Mar 09 13:35:01 crc kubenswrapper[4764]: I0309 13:35:01.074364 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-sl5hn" event={"ID":"6dc5759c-db8c-4025-bc16-a07e4dc6278a","Type":"ContainerStarted","Data":"87c879ff16aebfb3cdb302cf6fd8844416a459a934a74b6cd69bccff7b2c00b2"} Mar 09 13:35:01 crc kubenswrapper[4764]: I0309 13:35:01.075270 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-jschs" event={"ID":"abca721f-d47f-4e38-ab9e-0832de2c70e6","Type":"ContainerStarted","Data":"8d86efa7254330940104bfd148bae8dd455ff209b9ba97e02d72d3ff20ba27b3"} Mar 09 13:35:01 crc kubenswrapper[4764]: I0309 13:35:01.076759 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d959df5d8-nhrp9" event={"ID":"ff3abc33-b00a-400d-b1fb-c22dc5faf810","Type":"ContainerStarted","Data":"3de88e0490d67bb2012521e7663e9aaa9bc48e47450b3d17f45e0af89f24ff97"} Mar 09 13:35:01 crc kubenswrapper[4764]: I0309 13:35:01.125305 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-wv755" Mar 09 13:35:01 crc kubenswrapper[4764]: I0309 13:35:01.189277 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4" Mar 09 13:35:01 crc kubenswrapper[4764]: I0309 13:35:01.340764 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-wv755"] Mar 09 13:35:01 crc kubenswrapper[4764]: W0309 13:35:01.354125 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf339e495_f347_45b8_b9da_2cd832ac4300.slice/crio-9b42941dc94448878e40daa831a6ee288803393ccf2709213365cb94122d1321 WatchSource:0}: Error finding container 9b42941dc94448878e40daa831a6ee288803393ccf2709213365cb94122d1321: Status 404 returned error can't find the container with id 9b42941dc94448878e40daa831a6ee288803393ccf2709213365cb94122d1321 Mar 09 13:35:01 crc kubenswrapper[4764]: I0309 13:35:01.460285 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4"] Mar 09 13:35:01 crc kubenswrapper[4764]: W0309 13:35:01.465331 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc521772_06d5_47ec_85d0_6162bb98af30.slice/crio-410a51d1864d9bb6349cb99f0e102fe5fd1af4ad08cc90a661f9e6cbb1a1b5f7 WatchSource:0}: Error finding container 410a51d1864d9bb6349cb99f0e102fe5fd1af4ad08cc90a661f9e6cbb1a1b5f7: Status 404 returned error can't find the container with id 410a51d1864d9bb6349cb99f0e102fe5fd1af4ad08cc90a661f9e6cbb1a1b5f7 Mar 09 13:35:02 crc kubenswrapper[4764]: I0309 13:35:02.085812 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d959df5d8-nhrp9" 
event={"ID":"ff3abc33-b00a-400d-b1fb-c22dc5faf810","Type":"ContainerStarted","Data":"a3c715fa70c37d03b3eb50dc6f1fd611ac22eb383a18eec7881d70c171d35559"} Mar 09 13:35:02 crc kubenswrapper[4764]: I0309 13:35:02.087166 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-wv755" event={"ID":"f339e495-f347-45b8-b9da-2cd832ac4300","Type":"ContainerStarted","Data":"9b42941dc94448878e40daa831a6ee288803393ccf2709213365cb94122d1321"} Mar 09 13:35:02 crc kubenswrapper[4764]: I0309 13:35:02.088303 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4" event={"ID":"fc521772-06d5-47ec-85d0-6162bb98af30","Type":"ContainerStarted","Data":"410a51d1864d9bb6349cb99f0e102fe5fd1af4ad08cc90a661f9e6cbb1a1b5f7"} Mar 09 13:35:02 crc kubenswrapper[4764]: I0309 13:35:02.120341 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5d959df5d8-nhrp9" podStartSLOduration=2.120316716 podStartE2EDuration="2.120316716s" podCreationTimestamp="2026-03-09 13:35:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:35:02.11039783 +0000 UTC m=+857.360569748" watchObservedRunningTime="2026-03-09 13:35:02.120316716 +0000 UTC m=+857.370488634" Mar 09 13:35:05 crc kubenswrapper[4764]: I0309 13:35:05.122191 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-wv755" event={"ID":"f339e495-f347-45b8-b9da-2cd832ac4300","Type":"ContainerStarted","Data":"08254217e02f70211175f7112a5ef7ce9b4439544efc3e7084840e3bd68ef37a"} Mar 09 13:35:05 crc kubenswrapper[4764]: I0309 13:35:05.126325 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-sl5hn" 
event={"ID":"6dc5759c-db8c-4025-bc16-a07e4dc6278a","Type":"ContainerStarted","Data":"81432e70fc72636dc4294964eb5847dd4fbe6ec8ddf9dbc0d515b0c84d2c789f"} Mar 09 13:35:05 crc kubenswrapper[4764]: I0309 13:35:05.126533 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-sl5hn" Mar 09 13:35:05 crc kubenswrapper[4764]: I0309 13:35:05.129115 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-jschs" event={"ID":"abca721f-d47f-4e38-ab9e-0832de2c70e6","Type":"ContainerStarted","Data":"67bd949603aca51a6bcd4d16a552df568393a8b48f3ccb5ec5991adf6aaf86b8"} Mar 09 13:35:05 crc kubenswrapper[4764]: I0309 13:35:05.131406 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4" event={"ID":"fc521772-06d5-47ec-85d0-6162bb98af30","Type":"ContainerStarted","Data":"2cb071964e579b6015b8222b504fe82fd5087ddb44f756db99776aa34099944d"} Mar 09 13:35:05 crc kubenswrapper[4764]: I0309 13:35:05.179344 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-wv755" podStartSLOduration=2.007996908 podStartE2EDuration="5.179319824s" podCreationTimestamp="2026-03-09 13:35:00 +0000 UTC" firstStartedPulling="2026-03-09 13:35:01.360424028 +0000 UTC m=+856.610595936" lastFinishedPulling="2026-03-09 13:35:04.531746924 +0000 UTC m=+859.781918852" observedRunningTime="2026-03-09 13:35:05.157072036 +0000 UTC m=+860.407243964" watchObservedRunningTime="2026-03-09 13:35:05.179319824 +0000 UTC m=+860.429491732" Mar 09 13:35:05 crc kubenswrapper[4764]: I0309 13:35:05.181772 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-sl5hn" podStartSLOduration=1.253065872 podStartE2EDuration="5.181764869s" podCreationTimestamp="2026-03-09 13:35:00 +0000 UTC" firstStartedPulling="2026-03-09 13:35:00.631055619 +0000 UTC 
m=+855.881227527" lastFinishedPulling="2026-03-09 13:35:04.559754616 +0000 UTC m=+859.809926524" observedRunningTime="2026-03-09 13:35:05.174797712 +0000 UTC m=+860.424969620" watchObservedRunningTime="2026-03-09 13:35:05.181764869 +0000 UTC m=+860.431936777" Mar 09 13:35:05 crc kubenswrapper[4764]: I0309 13:35:05.209966 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4" podStartSLOduration=2.156406564 podStartE2EDuration="5.209938826s" podCreationTimestamp="2026-03-09 13:35:00 +0000 UTC" firstStartedPulling="2026-03-09 13:35:01.467675379 +0000 UTC m=+856.717847287" lastFinishedPulling="2026-03-09 13:35:04.521207621 +0000 UTC m=+859.771379549" observedRunningTime="2026-03-09 13:35:05.192985861 +0000 UTC m=+860.443157789" watchObservedRunningTime="2026-03-09 13:35:05.209938826 +0000 UTC m=+860.460110734" Mar 09 13:35:06 crc kubenswrapper[4764]: I0309 13:35:06.137636 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-wv755" Mar 09 13:35:07 crc kubenswrapper[4764]: I0309 13:35:07.145991 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-jschs" event={"ID":"abca721f-d47f-4e38-ab9e-0832de2c70e6","Type":"ContainerStarted","Data":"89b16c77fb82d30b0368a6d05c390d89970c7f7ed24264d1bcd3aed56be68a9e"} Mar 09 13:35:07 crc kubenswrapper[4764]: I0309 13:35:07.165340 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-jschs" podStartSLOduration=1.167404181 podStartE2EDuration="7.165312679s" podCreationTimestamp="2026-03-09 13:35:00 +0000 UTC" firstStartedPulling="2026-03-09 13:35:00.859194859 +0000 UTC m=+856.109366767" lastFinishedPulling="2026-03-09 13:35:06.857103357 +0000 UTC m=+862.107275265" observedRunningTime="2026-03-09 13:35:07.160690215 +0000 UTC m=+862.410862133" 
watchObservedRunningTime="2026-03-09 13:35:07.165312679 +0000 UTC m=+862.415484587" Mar 09 13:35:10 crc kubenswrapper[4764]: I0309 13:35:10.556758 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-sl5hn" Mar 09 13:35:10 crc kubenswrapper[4764]: I0309 13:35:10.802961 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:10 crc kubenswrapper[4764]: I0309 13:35:10.803029 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:10 crc kubenswrapper[4764]: I0309 13:35:10.808111 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:11 crc kubenswrapper[4764]: I0309 13:35:11.174862 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:11 crc kubenswrapper[4764]: I0309 13:35:11.232492 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-8g9lj"] Mar 09 13:35:21 crc kubenswrapper[4764]: I0309 13:35:21.132530 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-wv755" Mar 09 13:35:34 crc kubenswrapper[4764]: I0309 13:35:34.988552 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg"] Mar 09 13:35:34 crc kubenswrapper[4764]: I0309 13:35:34.991497 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg" Mar 09 13:35:34 crc kubenswrapper[4764]: I0309 13:35:34.993711 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 09 13:35:34 crc kubenswrapper[4764]: I0309 13:35:34.994867 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg"] Mar 09 13:35:35 crc kubenswrapper[4764]: I0309 13:35:35.019158 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/413f45cc-5916-45bb-a2a2-7b33029445af-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg\" (UID: \"413f45cc-5916-45bb-a2a2-7b33029445af\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg" Mar 09 13:35:35 crc kubenswrapper[4764]: I0309 13:35:35.019253 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9cl5\" (UniqueName: \"kubernetes.io/projected/413f45cc-5916-45bb-a2a2-7b33029445af-kube-api-access-q9cl5\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg\" (UID: \"413f45cc-5916-45bb-a2a2-7b33029445af\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg" Mar 09 13:35:35 crc kubenswrapper[4764]: I0309 13:35:35.019380 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/413f45cc-5916-45bb-a2a2-7b33029445af-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg\" (UID: \"413f45cc-5916-45bb-a2a2-7b33029445af\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg" Mar 09 13:35:35 crc kubenswrapper[4764]: 
I0309 13:35:35.121120 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/413f45cc-5916-45bb-a2a2-7b33029445af-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg\" (UID: \"413f45cc-5916-45bb-a2a2-7b33029445af\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg" Mar 09 13:35:35 crc kubenswrapper[4764]: I0309 13:35:35.121209 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9cl5\" (UniqueName: \"kubernetes.io/projected/413f45cc-5916-45bb-a2a2-7b33029445af-kube-api-access-q9cl5\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg\" (UID: \"413f45cc-5916-45bb-a2a2-7b33029445af\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg" Mar 09 13:35:35 crc kubenswrapper[4764]: I0309 13:35:35.121294 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/413f45cc-5916-45bb-a2a2-7b33029445af-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg\" (UID: \"413f45cc-5916-45bb-a2a2-7b33029445af\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg" Mar 09 13:35:35 crc kubenswrapper[4764]: I0309 13:35:35.122131 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/413f45cc-5916-45bb-a2a2-7b33029445af-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg\" (UID: \"413f45cc-5916-45bb-a2a2-7b33029445af\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg" Mar 09 13:35:35 crc kubenswrapper[4764]: I0309 13:35:35.122263 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/413f45cc-5916-45bb-a2a2-7b33029445af-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg\" (UID: \"413f45cc-5916-45bb-a2a2-7b33029445af\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg" Mar 09 13:35:35 crc kubenswrapper[4764]: I0309 13:35:35.157260 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9cl5\" (UniqueName: \"kubernetes.io/projected/413f45cc-5916-45bb-a2a2-7b33029445af-kube-api-access-q9cl5\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg\" (UID: \"413f45cc-5916-45bb-a2a2-7b33029445af\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg" Mar 09 13:35:35 crc kubenswrapper[4764]: I0309 13:35:35.316726 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg" Mar 09 13:35:35 crc kubenswrapper[4764]: I0309 13:35:35.757609 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg"] Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.281100 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-8g9lj" podUID="a1b4dc0b-edea-4c0d-8d61-3e3d3133605d" containerName="console" containerID="cri-o://472485e262204415197653c119165e15c1dde77e4c46e1bf7f2b023ba959af99" gracePeriod=15 Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.330374 4764 generic.go:334] "Generic (PLEG): container finished" podID="413f45cc-5916-45bb-a2a2-7b33029445af" containerID="b1914f15bf2404060d84165094e65d28117ea53289297f26ad60197b1fda3e40" exitCode=0 Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.330447 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg" event={"ID":"413f45cc-5916-45bb-a2a2-7b33029445af","Type":"ContainerDied","Data":"b1914f15bf2404060d84165094e65d28117ea53289297f26ad60197b1fda3e40"} Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.330509 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg" event={"ID":"413f45cc-5916-45bb-a2a2-7b33029445af","Type":"ContainerStarted","Data":"72224442d8862bb7b64a86203fad18ff6f25d0b361354db122490f417650fe17"} Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.631216 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-8g9lj_a1b4dc0b-edea-4c0d-8d61-3e3d3133605d/console/0.log" Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.631282 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.741243 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-config\") pod \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.741312 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9f9zl\" (UniqueName: \"kubernetes.io/projected/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-kube-api-access-9f9zl\") pod \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.741349 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-service-ca\") pod 
\"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.741382 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-serving-cert\") pod \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.741413 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-oauth-config\") pod \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.741429 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-trusted-ca-bundle\") pod \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.741467 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-oauth-serving-cert\") pod \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.742048 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-config" (OuterVolumeSpecName: "console-config") pod "a1b4dc0b-edea-4c0d-8d61-3e3d3133605d" (UID: "a1b4dc0b-edea-4c0d-8d61-3e3d3133605d"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.742061 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a1b4dc0b-edea-4c0d-8d61-3e3d3133605d" (UID: "a1b4dc0b-edea-4c0d-8d61-3e3d3133605d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.742374 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a1b4dc0b-edea-4c0d-8d61-3e3d3133605d" (UID: "a1b4dc0b-edea-4c0d-8d61-3e3d3133605d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.742369 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-service-ca" (OuterVolumeSpecName: "service-ca") pod "a1b4dc0b-edea-4c0d-8d61-3e3d3133605d" (UID: "a1b4dc0b-edea-4c0d-8d61-3e3d3133605d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.746792 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-kube-api-access-9f9zl" (OuterVolumeSpecName: "kube-api-access-9f9zl") pod "a1b4dc0b-edea-4c0d-8d61-3e3d3133605d" (UID: "a1b4dc0b-edea-4c0d-8d61-3e3d3133605d"). InnerVolumeSpecName "kube-api-access-9f9zl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.747029 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a1b4dc0b-edea-4c0d-8d61-3e3d3133605d" (UID: "a1b4dc0b-edea-4c0d-8d61-3e3d3133605d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.747199 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a1b4dc0b-edea-4c0d-8d61-3e3d3133605d" (UID: "a1b4dc0b-edea-4c0d-8d61-3e3d3133605d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.842688 4764 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.842731 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9f9zl\" (UniqueName: \"kubernetes.io/projected/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-kube-api-access-9f9zl\") on node \"crc\" DevicePath \"\"" Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.842742 4764 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.842751 4764 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.842762 4764 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.842770 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.842779 4764 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:35:37 crc kubenswrapper[4764]: I0309 13:35:37.338411 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-8g9lj_a1b4dc0b-edea-4c0d-8d61-3e3d3133605d/console/0.log" Mar 09 13:35:37 crc kubenswrapper[4764]: I0309 13:35:37.338478 4764 generic.go:334] "Generic (PLEG): container finished" podID="a1b4dc0b-edea-4c0d-8d61-3e3d3133605d" containerID="472485e262204415197653c119165e15c1dde77e4c46e1bf7f2b023ba959af99" exitCode=2 Mar 09 13:35:37 crc kubenswrapper[4764]: I0309 13:35:37.338512 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8g9lj" event={"ID":"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d","Type":"ContainerDied","Data":"472485e262204415197653c119165e15c1dde77e4c46e1bf7f2b023ba959af99"} Mar 09 13:35:37 crc kubenswrapper[4764]: I0309 13:35:37.338544 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8g9lj" 
event={"ID":"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d","Type":"ContainerDied","Data":"74b142593d2c61fe165941847cdc507401c02d77a28ca82096827ce65a85b3e5"} Mar 09 13:35:37 crc kubenswrapper[4764]: I0309 13:35:37.338563 4764 scope.go:117] "RemoveContainer" containerID="472485e262204415197653c119165e15c1dde77e4c46e1bf7f2b023ba959af99" Mar 09 13:35:37 crc kubenswrapper[4764]: I0309 13:35:37.338722 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:35:37 crc kubenswrapper[4764]: I0309 13:35:37.385739 4764 scope.go:117] "RemoveContainer" containerID="472485e262204415197653c119165e15c1dde77e4c46e1bf7f2b023ba959af99" Mar 09 13:35:37 crc kubenswrapper[4764]: E0309 13:35:37.386149 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"472485e262204415197653c119165e15c1dde77e4c46e1bf7f2b023ba959af99\": container with ID starting with 472485e262204415197653c119165e15c1dde77e4c46e1bf7f2b023ba959af99 not found: ID does not exist" containerID="472485e262204415197653c119165e15c1dde77e4c46e1bf7f2b023ba959af99" Mar 09 13:35:37 crc kubenswrapper[4764]: I0309 13:35:37.386193 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"472485e262204415197653c119165e15c1dde77e4c46e1bf7f2b023ba959af99"} err="failed to get container status \"472485e262204415197653c119165e15c1dde77e4c46e1bf7f2b023ba959af99\": rpc error: code = NotFound desc = could not find container \"472485e262204415197653c119165e15c1dde77e4c46e1bf7f2b023ba959af99\": container with ID starting with 472485e262204415197653c119165e15c1dde77e4c46e1bf7f2b023ba959af99 not found: ID does not exist" Mar 09 13:35:37 crc kubenswrapper[4764]: I0309 13:35:37.387446 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-8g9lj"] Mar 09 13:35:37 crc kubenswrapper[4764]: I0309 13:35:37.394605 4764 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-8g9lj"] Mar 09 13:35:37 crc kubenswrapper[4764]: I0309 13:35:37.568070 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1b4dc0b-edea-4c0d-8d61-3e3d3133605d" path="/var/lib/kubelet/pods/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d/volumes" Mar 09 13:35:38 crc kubenswrapper[4764]: I0309 13:35:38.330304 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vtshp"] Mar 09 13:35:38 crc kubenswrapper[4764]: E0309 13:35:38.330608 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1b4dc0b-edea-4c0d-8d61-3e3d3133605d" containerName="console" Mar 09 13:35:38 crc kubenswrapper[4764]: I0309 13:35:38.330630 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1b4dc0b-edea-4c0d-8d61-3e3d3133605d" containerName="console" Mar 09 13:35:38 crc kubenswrapper[4764]: I0309 13:35:38.330770 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1b4dc0b-edea-4c0d-8d61-3e3d3133605d" containerName="console" Mar 09 13:35:38 crc kubenswrapper[4764]: I0309 13:35:38.332130 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vtshp" Mar 09 13:35:38 crc kubenswrapper[4764]: I0309 13:35:38.343616 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vtshp"] Mar 09 13:35:38 crc kubenswrapper[4764]: I0309 13:35:38.368722 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-utilities\") pod \"redhat-operators-vtshp\" (UID: \"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a\") " pod="openshift-marketplace/redhat-operators-vtshp" Mar 09 13:35:38 crc kubenswrapper[4764]: I0309 13:35:38.368802 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-catalog-content\") pod \"redhat-operators-vtshp\" (UID: \"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a\") " pod="openshift-marketplace/redhat-operators-vtshp" Mar 09 13:35:38 crc kubenswrapper[4764]: I0309 13:35:38.368955 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55cnb\" (UniqueName: \"kubernetes.io/projected/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-kube-api-access-55cnb\") pod \"redhat-operators-vtshp\" (UID: \"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a\") " pod="openshift-marketplace/redhat-operators-vtshp" Mar 09 13:35:38 crc kubenswrapper[4764]: I0309 13:35:38.471214 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55cnb\" (UniqueName: \"kubernetes.io/projected/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-kube-api-access-55cnb\") pod \"redhat-operators-vtshp\" (UID: \"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a\") " pod="openshift-marketplace/redhat-operators-vtshp" Mar 09 13:35:38 crc kubenswrapper[4764]: I0309 13:35:38.471310 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-utilities\") pod \"redhat-operators-vtshp\" (UID: \"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a\") " pod="openshift-marketplace/redhat-operators-vtshp" Mar 09 13:35:38 crc kubenswrapper[4764]: I0309 13:35:38.471333 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-catalog-content\") pod \"redhat-operators-vtshp\" (UID: \"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a\") " pod="openshift-marketplace/redhat-operators-vtshp" Mar 09 13:35:38 crc kubenswrapper[4764]: I0309 13:35:38.472112 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-catalog-content\") pod \"redhat-operators-vtshp\" (UID: \"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a\") " pod="openshift-marketplace/redhat-operators-vtshp" Mar 09 13:35:38 crc kubenswrapper[4764]: I0309 13:35:38.472792 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-utilities\") pod \"redhat-operators-vtshp\" (UID: \"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a\") " pod="openshift-marketplace/redhat-operators-vtshp" Mar 09 13:35:38 crc kubenswrapper[4764]: I0309 13:35:38.496898 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55cnb\" (UniqueName: \"kubernetes.io/projected/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-kube-api-access-55cnb\") pod \"redhat-operators-vtshp\" (UID: \"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a\") " pod="openshift-marketplace/redhat-operators-vtshp" Mar 09 13:35:38 crc kubenswrapper[4764]: I0309 13:35:38.662431 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vtshp" Mar 09 13:35:38 crc kubenswrapper[4764]: I0309 13:35:38.932373 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vtshp"] Mar 09 13:35:38 crc kubenswrapper[4764]: W0309 13:35:38.977585 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ebb4d3c_cf42_4cc4_9856_8bdbfffbd53a.slice/crio-b1aabe863f1f8bacecabae737d4453f4b2305ab4eb3778cf0d6e8b21552025a9 WatchSource:0}: Error finding container b1aabe863f1f8bacecabae737d4453f4b2305ab4eb3778cf0d6e8b21552025a9: Status 404 returned error can't find the container with id b1aabe863f1f8bacecabae737d4453f4b2305ab4eb3778cf0d6e8b21552025a9 Mar 09 13:35:39 crc kubenswrapper[4764]: I0309 13:35:39.365187 4764 generic.go:334] "Generic (PLEG): container finished" podID="413f45cc-5916-45bb-a2a2-7b33029445af" containerID="12e5be17dde6662ac78b5ccfb4597655e2de81cd358d4b05d3b1385bc9e64d2a" exitCode=0 Mar 09 13:35:39 crc kubenswrapper[4764]: I0309 13:35:39.365257 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg" event={"ID":"413f45cc-5916-45bb-a2a2-7b33029445af","Type":"ContainerDied","Data":"12e5be17dde6662ac78b5ccfb4597655e2de81cd358d4b05d3b1385bc9e64d2a"} Mar 09 13:35:39 crc kubenswrapper[4764]: I0309 13:35:39.366915 4764 generic.go:334] "Generic (PLEG): container finished" podID="0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a" containerID="cd30553e3203601e67a4a200045a0ff445fed5ef9355544a81ecb96bd9dfbcc0" exitCode=0 Mar 09 13:35:39 crc kubenswrapper[4764]: I0309 13:35:39.366979 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vtshp" event={"ID":"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a","Type":"ContainerDied","Data":"cd30553e3203601e67a4a200045a0ff445fed5ef9355544a81ecb96bd9dfbcc0"} Mar 09 13:35:39 crc 
kubenswrapper[4764]: I0309 13:35:39.367016 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vtshp" event={"ID":"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a","Type":"ContainerStarted","Data":"b1aabe863f1f8bacecabae737d4453f4b2305ab4eb3778cf0d6e8b21552025a9"} Mar 09 13:35:40 crc kubenswrapper[4764]: I0309 13:35:40.376159 4764 generic.go:334] "Generic (PLEG): container finished" podID="413f45cc-5916-45bb-a2a2-7b33029445af" containerID="7be3f3fd424c604b44633af0868d3f5a689458b5777c17c82d3868752bae1dd5" exitCode=0 Mar 09 13:35:40 crc kubenswrapper[4764]: I0309 13:35:40.376263 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg" event={"ID":"413f45cc-5916-45bb-a2a2-7b33029445af","Type":"ContainerDied","Data":"7be3f3fd424c604b44633af0868d3f5a689458b5777c17c82d3868752bae1dd5"} Mar 09 13:35:41 crc kubenswrapper[4764]: I0309 13:35:41.643448 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg" Mar 09 13:35:41 crc kubenswrapper[4764]: I0309 13:35:41.717230 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9cl5\" (UniqueName: \"kubernetes.io/projected/413f45cc-5916-45bb-a2a2-7b33029445af-kube-api-access-q9cl5\") pod \"413f45cc-5916-45bb-a2a2-7b33029445af\" (UID: \"413f45cc-5916-45bb-a2a2-7b33029445af\") " Mar 09 13:35:41 crc kubenswrapper[4764]: I0309 13:35:41.717471 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/413f45cc-5916-45bb-a2a2-7b33029445af-bundle\") pod \"413f45cc-5916-45bb-a2a2-7b33029445af\" (UID: \"413f45cc-5916-45bb-a2a2-7b33029445af\") " Mar 09 13:35:41 crc kubenswrapper[4764]: I0309 13:35:41.717503 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/413f45cc-5916-45bb-a2a2-7b33029445af-util\") pod \"413f45cc-5916-45bb-a2a2-7b33029445af\" (UID: \"413f45cc-5916-45bb-a2a2-7b33029445af\") " Mar 09 13:35:41 crc kubenswrapper[4764]: I0309 13:35:41.718728 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/413f45cc-5916-45bb-a2a2-7b33029445af-bundle" (OuterVolumeSpecName: "bundle") pod "413f45cc-5916-45bb-a2a2-7b33029445af" (UID: "413f45cc-5916-45bb-a2a2-7b33029445af"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:35:41 crc kubenswrapper[4764]: I0309 13:35:41.725385 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/413f45cc-5916-45bb-a2a2-7b33029445af-kube-api-access-q9cl5" (OuterVolumeSpecName: "kube-api-access-q9cl5") pod "413f45cc-5916-45bb-a2a2-7b33029445af" (UID: "413f45cc-5916-45bb-a2a2-7b33029445af"). InnerVolumeSpecName "kube-api-access-q9cl5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:35:41 crc kubenswrapper[4764]: I0309 13:35:41.727717 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/413f45cc-5916-45bb-a2a2-7b33029445af-util" (OuterVolumeSpecName: "util") pod "413f45cc-5916-45bb-a2a2-7b33029445af" (UID: "413f45cc-5916-45bb-a2a2-7b33029445af"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:35:41 crc kubenswrapper[4764]: I0309 13:35:41.819603 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9cl5\" (UniqueName: \"kubernetes.io/projected/413f45cc-5916-45bb-a2a2-7b33029445af-kube-api-access-q9cl5\") on node \"crc\" DevicePath \"\"" Mar 09 13:35:41 crc kubenswrapper[4764]: I0309 13:35:41.819659 4764 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/413f45cc-5916-45bb-a2a2-7b33029445af-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:35:41 crc kubenswrapper[4764]: I0309 13:35:41.819669 4764 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/413f45cc-5916-45bb-a2a2-7b33029445af-util\") on node \"crc\" DevicePath \"\"" Mar 09 13:35:42 crc kubenswrapper[4764]: I0309 13:35:42.394224 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg" event={"ID":"413f45cc-5916-45bb-a2a2-7b33029445af","Type":"ContainerDied","Data":"72224442d8862bb7b64a86203fad18ff6f25d0b361354db122490f417650fe17"} Mar 09 13:35:42 crc kubenswrapper[4764]: I0309 13:35:42.394630 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72224442d8862bb7b64a86203fad18ff6f25d0b361354db122490f417650fe17" Mar 09 13:35:42 crc kubenswrapper[4764]: I0309 13:35:42.394301 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg" Mar 09 13:35:45 crc kubenswrapper[4764]: I0309 13:35:45.412231 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vtshp" event={"ID":"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a","Type":"ContainerStarted","Data":"865c7b22e00481478eb7d002c1f40157c39f86e5b45add0664ca71524b2698e9"} Mar 09 13:35:45 crc kubenswrapper[4764]: E0309 13:35:45.769747 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ebb4d3c_cf42_4cc4_9856_8bdbfffbd53a.slice/crio-conmon-865c7b22e00481478eb7d002c1f40157c39f86e5b45add0664ca71524b2698e9.scope\": RecentStats: unable to find data in memory cache]" Mar 09 13:35:46 crc kubenswrapper[4764]: I0309 13:35:46.418919 4764 generic.go:334] "Generic (PLEG): container finished" podID="0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a" containerID="865c7b22e00481478eb7d002c1f40157c39f86e5b45add0664ca71524b2698e9" exitCode=0 Mar 09 13:35:46 crc kubenswrapper[4764]: I0309 13:35:46.418982 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vtshp" event={"ID":"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a","Type":"ContainerDied","Data":"865c7b22e00481478eb7d002c1f40157c39f86e5b45add0664ca71524b2698e9"} Mar 09 13:35:47 crc kubenswrapper[4764]: I0309 13:35:47.430723 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vtshp" event={"ID":"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a","Type":"ContainerStarted","Data":"3a09b89cfa6f58872adaec4a35a3d3cd1dd0dc256900a0efe55e07d8f9199cdc"} Mar 09 13:35:47 crc kubenswrapper[4764]: I0309 13:35:47.464127 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vtshp" podStartSLOduration=1.89836003 
podStartE2EDuration="9.464102366s" podCreationTimestamp="2026-03-09 13:35:38 +0000 UTC" firstStartedPulling="2026-03-09 13:35:39.368468971 +0000 UTC m=+894.618640889" lastFinishedPulling="2026-03-09 13:35:46.934211317 +0000 UTC m=+902.184383225" observedRunningTime="2026-03-09 13:35:47.458610379 +0000 UTC m=+902.708782297" watchObservedRunningTime="2026-03-09 13:35:47.464102366 +0000 UTC m=+902.714274274" Mar 09 13:35:48 crc kubenswrapper[4764]: I0309 13:35:48.662847 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vtshp" Mar 09 13:35:48 crc kubenswrapper[4764]: I0309 13:35:48.662991 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vtshp" Mar 09 13:35:49 crc kubenswrapper[4764]: I0309 13:35:49.702898 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vtshp" podUID="0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a" containerName="registry-server" probeResult="failure" output=< Mar 09 13:35:49 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Mar 09 13:35:49 crc kubenswrapper[4764]: > Mar 09 13:35:50 crc kubenswrapper[4764]: I0309 13:35:50.931368 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-659b995f59-8s255"] Mar 09 13:35:50 crc kubenswrapper[4764]: E0309 13:35:50.931788 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="413f45cc-5916-45bb-a2a2-7b33029445af" containerName="util" Mar 09 13:35:50 crc kubenswrapper[4764]: I0309 13:35:50.931807 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="413f45cc-5916-45bb-a2a2-7b33029445af" containerName="util" Mar 09 13:35:50 crc kubenswrapper[4764]: E0309 13:35:50.931824 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="413f45cc-5916-45bb-a2a2-7b33029445af" containerName="extract" Mar 09 13:35:50 crc 
kubenswrapper[4764]: I0309 13:35:50.931832 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="413f45cc-5916-45bb-a2a2-7b33029445af" containerName="extract" Mar 09 13:35:50 crc kubenswrapper[4764]: E0309 13:35:50.931861 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="413f45cc-5916-45bb-a2a2-7b33029445af" containerName="pull" Mar 09 13:35:50 crc kubenswrapper[4764]: I0309 13:35:50.931869 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="413f45cc-5916-45bb-a2a2-7b33029445af" containerName="pull" Mar 09 13:35:50 crc kubenswrapper[4764]: I0309 13:35:50.932021 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="413f45cc-5916-45bb-a2a2-7b33029445af" containerName="extract" Mar 09 13:35:50 crc kubenswrapper[4764]: I0309 13:35:50.932630 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-659b995f59-8s255" Mar 09 13:35:50 crc kubenswrapper[4764]: I0309 13:35:50.937468 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 09 13:35:50 crc kubenswrapper[4764]: I0309 13:35:50.937780 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 09 13:35:50 crc kubenswrapper[4764]: I0309 13:35:50.938616 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-ldn24" Mar 09 13:35:50 crc kubenswrapper[4764]: I0309 13:35:50.938695 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 09 13:35:50 crc kubenswrapper[4764]: I0309 13:35:50.939876 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 09 13:35:50 crc kubenswrapper[4764]: I0309 13:35:50.946992 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["metallb-system/metallb-operator-controller-manager-659b995f59-8s255"] Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.075854 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b89770ec-e502-4b3a-8233-8c9aa76d55de-apiservice-cert\") pod \"metallb-operator-controller-manager-659b995f59-8s255\" (UID: \"b89770ec-e502-4b3a-8233-8c9aa76d55de\") " pod="metallb-system/metallb-operator-controller-manager-659b995f59-8s255" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.075919 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b89770ec-e502-4b3a-8233-8c9aa76d55de-webhook-cert\") pod \"metallb-operator-controller-manager-659b995f59-8s255\" (UID: \"b89770ec-e502-4b3a-8233-8c9aa76d55de\") " pod="metallb-system/metallb-operator-controller-manager-659b995f59-8s255" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.075975 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvmg2\" (UniqueName: \"kubernetes.io/projected/b89770ec-e502-4b3a-8233-8c9aa76d55de-kube-api-access-jvmg2\") pod \"metallb-operator-controller-manager-659b995f59-8s255\" (UID: \"b89770ec-e502-4b3a-8233-8c9aa76d55de\") " pod="metallb-system/metallb-operator-controller-manager-659b995f59-8s255" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.169277 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv"] Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.170039 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.172218 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-jtx95" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.172272 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.174694 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.177531 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b89770ec-e502-4b3a-8233-8c9aa76d55de-apiservice-cert\") pod \"metallb-operator-controller-manager-659b995f59-8s255\" (UID: \"b89770ec-e502-4b3a-8233-8c9aa76d55de\") " pod="metallb-system/metallb-operator-controller-manager-659b995f59-8s255" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.177585 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b89770ec-e502-4b3a-8233-8c9aa76d55de-webhook-cert\") pod \"metallb-operator-controller-manager-659b995f59-8s255\" (UID: \"b89770ec-e502-4b3a-8233-8c9aa76d55de\") " pod="metallb-system/metallb-operator-controller-manager-659b995f59-8s255" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.177635 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvmg2\" (UniqueName: \"kubernetes.io/projected/b89770ec-e502-4b3a-8233-8c9aa76d55de-kube-api-access-jvmg2\") pod \"metallb-operator-controller-manager-659b995f59-8s255\" (UID: \"b89770ec-e502-4b3a-8233-8c9aa76d55de\") " pod="metallb-system/metallb-operator-controller-manager-659b995f59-8s255" Mar 09 13:35:51 crc 
kubenswrapper[4764]: I0309 13:35:51.185338 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b89770ec-e502-4b3a-8233-8c9aa76d55de-webhook-cert\") pod \"metallb-operator-controller-manager-659b995f59-8s255\" (UID: \"b89770ec-e502-4b3a-8233-8c9aa76d55de\") " pod="metallb-system/metallb-operator-controller-manager-659b995f59-8s255" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.186990 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b89770ec-e502-4b3a-8233-8c9aa76d55de-apiservice-cert\") pod \"metallb-operator-controller-manager-659b995f59-8s255\" (UID: \"b89770ec-e502-4b3a-8233-8c9aa76d55de\") " pod="metallb-system/metallb-operator-controller-manager-659b995f59-8s255" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.189620 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv"] Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.209420 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvmg2\" (UniqueName: \"kubernetes.io/projected/b89770ec-e502-4b3a-8233-8c9aa76d55de-kube-api-access-jvmg2\") pod \"metallb-operator-controller-manager-659b995f59-8s255\" (UID: \"b89770ec-e502-4b3a-8233-8c9aa76d55de\") " pod="metallb-system/metallb-operator-controller-manager-659b995f59-8s255" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.254578 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-659b995f59-8s255" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.278591 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn56m\" (UniqueName: \"kubernetes.io/projected/ed37a5d1-5d4b-41fb-8476-189def32c909-kube-api-access-cn56m\") pod \"metallb-operator-webhook-server-7bccb4c96-wqdrv\" (UID: \"ed37a5d1-5d4b-41fb-8476-189def32c909\") " pod="metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.278681 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ed37a5d1-5d4b-41fb-8476-189def32c909-webhook-cert\") pod \"metallb-operator-webhook-server-7bccb4c96-wqdrv\" (UID: \"ed37a5d1-5d4b-41fb-8476-189def32c909\") " pod="metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.278720 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ed37a5d1-5d4b-41fb-8476-189def32c909-apiservice-cert\") pod \"metallb-operator-webhook-server-7bccb4c96-wqdrv\" (UID: \"ed37a5d1-5d4b-41fb-8476-189def32c909\") " pod="metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.381144 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ed37a5d1-5d4b-41fb-8476-189def32c909-apiservice-cert\") pod \"metallb-operator-webhook-server-7bccb4c96-wqdrv\" (UID: \"ed37a5d1-5d4b-41fb-8476-189def32c909\") " pod="metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.381865 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cn56m\" (UniqueName: \"kubernetes.io/projected/ed37a5d1-5d4b-41fb-8476-189def32c909-kube-api-access-cn56m\") pod \"metallb-operator-webhook-server-7bccb4c96-wqdrv\" (UID: \"ed37a5d1-5d4b-41fb-8476-189def32c909\") " pod="metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.381928 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ed37a5d1-5d4b-41fb-8476-189def32c909-webhook-cert\") pod \"metallb-operator-webhook-server-7bccb4c96-wqdrv\" (UID: \"ed37a5d1-5d4b-41fb-8476-189def32c909\") " pod="metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.386060 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ed37a5d1-5d4b-41fb-8476-189def32c909-apiservice-cert\") pod \"metallb-operator-webhook-server-7bccb4c96-wqdrv\" (UID: \"ed37a5d1-5d4b-41fb-8476-189def32c909\") " pod="metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.388636 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ed37a5d1-5d4b-41fb-8476-189def32c909-webhook-cert\") pod \"metallb-operator-webhook-server-7bccb4c96-wqdrv\" (UID: \"ed37a5d1-5d4b-41fb-8476-189def32c909\") " pod="metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.406896 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn56m\" (UniqueName: \"kubernetes.io/projected/ed37a5d1-5d4b-41fb-8476-189def32c909-kube-api-access-cn56m\") pod \"metallb-operator-webhook-server-7bccb4c96-wqdrv\" (UID: \"ed37a5d1-5d4b-41fb-8476-189def32c909\") " 
pod="metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.542127 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.628171 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-659b995f59-8s255"] Mar 09 13:35:51 crc kubenswrapper[4764]: W0309 13:35:51.650632 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb89770ec_e502_4b3a_8233_8c9aa76d55de.slice/crio-1c3cc6a46e1308cf3647cb168debebee0e6505ea65ce44bf3c0ba4be43dc8f42 WatchSource:0}: Error finding container 1c3cc6a46e1308cf3647cb168debebee0e6505ea65ce44bf3c0ba4be43dc8f42: Status 404 returned error can't find the container with id 1c3cc6a46e1308cf3647cb168debebee0e6505ea65ce44bf3c0ba4be43dc8f42 Mar 09 13:35:52 crc kubenswrapper[4764]: I0309 13:35:52.213540 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv"] Mar 09 13:35:52 crc kubenswrapper[4764]: W0309 13:35:52.215610 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded37a5d1_5d4b_41fb_8476_189def32c909.slice/crio-4fed1d217b5044ce3588ab59fefb0593c45f3823512ce3923b0ff444eb3f2846 WatchSource:0}: Error finding container 4fed1d217b5044ce3588ab59fefb0593c45f3823512ce3923b0ff444eb3f2846: Status 404 returned error can't find the container with id 4fed1d217b5044ce3588ab59fefb0593c45f3823512ce3923b0ff444eb3f2846 Mar 09 13:35:52 crc kubenswrapper[4764]: I0309 13:35:52.468020 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv" 
event={"ID":"ed37a5d1-5d4b-41fb-8476-189def32c909","Type":"ContainerStarted","Data":"4fed1d217b5044ce3588ab59fefb0593c45f3823512ce3923b0ff444eb3f2846"} Mar 09 13:35:52 crc kubenswrapper[4764]: I0309 13:35:52.470099 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-659b995f59-8s255" event={"ID":"b89770ec-e502-4b3a-8233-8c9aa76d55de","Type":"ContainerStarted","Data":"1c3cc6a46e1308cf3647cb168debebee0e6505ea65ce44bf3c0ba4be43dc8f42"} Mar 09 13:35:58 crc kubenswrapper[4764]: I0309 13:35:58.525588 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv" event={"ID":"ed37a5d1-5d4b-41fb-8476-189def32c909","Type":"ContainerStarted","Data":"dc1706c51269524f0b79241f7df5be4ff4134518ccda55d641057a2484396f4c"} Mar 09 13:35:58 crc kubenswrapper[4764]: I0309 13:35:58.526291 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv" Mar 09 13:35:58 crc kubenswrapper[4764]: I0309 13:35:58.527270 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-659b995f59-8s255" event={"ID":"b89770ec-e502-4b3a-8233-8c9aa76d55de","Type":"ContainerStarted","Data":"1d3a57f678830400b5b899f6366dc418082f694e07f45a58c39ad8a42955a633"} Mar 09 13:35:58 crc kubenswrapper[4764]: I0309 13:35:58.527723 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-659b995f59-8s255" Mar 09 13:35:58 crc kubenswrapper[4764]: I0309 13:35:58.551726 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv" podStartSLOduration=1.9586393439999998 podStartE2EDuration="7.551707375s" podCreationTimestamp="2026-03-09 13:35:51 +0000 UTC" firstStartedPulling="2026-03-09 13:35:52.219178747 +0000 UTC m=+907.469350655" 
lastFinishedPulling="2026-03-09 13:35:57.812246778 +0000 UTC m=+913.062418686" observedRunningTime="2026-03-09 13:35:58.547028849 +0000 UTC m=+913.797200777" watchObservedRunningTime="2026-03-09 13:35:58.551707375 +0000 UTC m=+913.801879283" Mar 09 13:35:58 crc kubenswrapper[4764]: I0309 13:35:58.577415 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-659b995f59-8s255" podStartSLOduration=2.439775926 podStartE2EDuration="8.577391851s" podCreationTimestamp="2026-03-09 13:35:50 +0000 UTC" firstStartedPulling="2026-03-09 13:35:51.65583871 +0000 UTC m=+906.906010618" lastFinishedPulling="2026-03-09 13:35:57.793454635 +0000 UTC m=+913.043626543" observedRunningTime="2026-03-09 13:35:58.576263961 +0000 UTC m=+913.826435869" watchObservedRunningTime="2026-03-09 13:35:58.577391851 +0000 UTC m=+913.827563759" Mar 09 13:35:58 crc kubenswrapper[4764]: I0309 13:35:58.704191 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vtshp" Mar 09 13:35:58 crc kubenswrapper[4764]: I0309 13:35:58.773138 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vtshp" Mar 09 13:35:59 crc kubenswrapper[4764]: I0309 13:35:59.323330 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vtshp"] Mar 09 13:36:00 crc kubenswrapper[4764]: I0309 13:36:00.140480 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551056-chr2q"] Mar 09 13:36:00 crc kubenswrapper[4764]: I0309 13:36:00.141609 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551056-chr2q" Mar 09 13:36:00 crc kubenswrapper[4764]: I0309 13:36:00.145192 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 13:36:00 crc kubenswrapper[4764]: I0309 13:36:00.145207 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:36:00 crc kubenswrapper[4764]: I0309 13:36:00.145736 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:36:00 crc kubenswrapper[4764]: I0309 13:36:00.153834 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551056-chr2q"] Mar 09 13:36:00 crc kubenswrapper[4764]: I0309 13:36:00.224073 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6dl4\" (UniqueName: \"kubernetes.io/projected/b5ceebdd-e9ad-472a-8806-f5b441ced89a-kube-api-access-h6dl4\") pod \"auto-csr-approver-29551056-chr2q\" (UID: \"b5ceebdd-e9ad-472a-8806-f5b441ced89a\") " pod="openshift-infra/auto-csr-approver-29551056-chr2q" Mar 09 13:36:00 crc kubenswrapper[4764]: I0309 13:36:00.325339 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6dl4\" (UniqueName: \"kubernetes.io/projected/b5ceebdd-e9ad-472a-8806-f5b441ced89a-kube-api-access-h6dl4\") pod \"auto-csr-approver-29551056-chr2q\" (UID: \"b5ceebdd-e9ad-472a-8806-f5b441ced89a\") " pod="openshift-infra/auto-csr-approver-29551056-chr2q" Mar 09 13:36:00 crc kubenswrapper[4764]: I0309 13:36:00.346452 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6dl4\" (UniqueName: \"kubernetes.io/projected/b5ceebdd-e9ad-472a-8806-f5b441ced89a-kube-api-access-h6dl4\") pod \"auto-csr-approver-29551056-chr2q\" (UID: \"b5ceebdd-e9ad-472a-8806-f5b441ced89a\") " 
pod="openshift-infra/auto-csr-approver-29551056-chr2q" Mar 09 13:36:00 crc kubenswrapper[4764]: I0309 13:36:00.498520 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551056-chr2q" Mar 09 13:36:00 crc kubenswrapper[4764]: I0309 13:36:00.542225 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vtshp" podUID="0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a" containerName="registry-server" containerID="cri-o://3a09b89cfa6f58872adaec4a35a3d3cd1dd0dc256900a0efe55e07d8f9199cdc" gracePeriod=2 Mar 09 13:36:00 crc kubenswrapper[4764]: I0309 13:36:00.789363 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551056-chr2q"] Mar 09 13:36:00 crc kubenswrapper[4764]: I0309 13:36:00.911882 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vtshp" Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.042830 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-utilities\") pod \"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a\" (UID: \"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a\") " Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.042907 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-catalog-content\") pod \"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a\" (UID: \"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a\") " Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.042933 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55cnb\" (UniqueName: \"kubernetes.io/projected/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-kube-api-access-55cnb\") pod 
\"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a\" (UID: \"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a\") " Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.044278 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-utilities" (OuterVolumeSpecName: "utilities") pod "0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a" (UID: "0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.048413 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-kube-api-access-55cnb" (OuterVolumeSpecName: "kube-api-access-55cnb") pod "0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a" (UID: "0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a"). InnerVolumeSpecName "kube-api-access-55cnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.144478 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.144517 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55cnb\" (UniqueName: \"kubernetes.io/projected/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-kube-api-access-55cnb\") on node \"crc\" DevicePath \"\"" Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.164414 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a" (UID: "0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.246513 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.549768 4764 generic.go:334] "Generic (PLEG): container finished" podID="0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a" containerID="3a09b89cfa6f58872adaec4a35a3d3cd1dd0dc256900a0efe55e07d8f9199cdc" exitCode=0 Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.549823 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vtshp" event={"ID":"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a","Type":"ContainerDied","Data":"3a09b89cfa6f58872adaec4a35a3d3cd1dd0dc256900a0efe55e07d8f9199cdc"} Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.549849 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vtshp" event={"ID":"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a","Type":"ContainerDied","Data":"b1aabe863f1f8bacecabae737d4453f4b2305ab4eb3778cf0d6e8b21552025a9"} Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.549865 4764 scope.go:117] "RemoveContainer" containerID="3a09b89cfa6f58872adaec4a35a3d3cd1dd0dc256900a0efe55e07d8f9199cdc" Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.549982 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vtshp" Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.552398 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551056-chr2q" event={"ID":"b5ceebdd-e9ad-472a-8806-f5b441ced89a","Type":"ContainerStarted","Data":"25f6a3f57e2ed3f49a95fdf9ce89a920cef1531b81bc2f3426cf66871e5a53ad"} Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.580941 4764 scope.go:117] "RemoveContainer" containerID="865c7b22e00481478eb7d002c1f40157c39f86e5b45add0664ca71524b2698e9" Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.581734 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vtshp"] Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.585474 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vtshp"] Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.599595 4764 scope.go:117] "RemoveContainer" containerID="cd30553e3203601e67a4a200045a0ff445fed5ef9355544a81ecb96bd9dfbcc0" Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.623762 4764 scope.go:117] "RemoveContainer" containerID="3a09b89cfa6f58872adaec4a35a3d3cd1dd0dc256900a0efe55e07d8f9199cdc" Mar 09 13:36:01 crc kubenswrapper[4764]: E0309 13:36:01.624447 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a09b89cfa6f58872adaec4a35a3d3cd1dd0dc256900a0efe55e07d8f9199cdc\": container with ID starting with 3a09b89cfa6f58872adaec4a35a3d3cd1dd0dc256900a0efe55e07d8f9199cdc not found: ID does not exist" containerID="3a09b89cfa6f58872adaec4a35a3d3cd1dd0dc256900a0efe55e07d8f9199cdc" Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.624690 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a09b89cfa6f58872adaec4a35a3d3cd1dd0dc256900a0efe55e07d8f9199cdc"} err="failed to get 
container status \"3a09b89cfa6f58872adaec4a35a3d3cd1dd0dc256900a0efe55e07d8f9199cdc\": rpc error: code = NotFound desc = could not find container \"3a09b89cfa6f58872adaec4a35a3d3cd1dd0dc256900a0efe55e07d8f9199cdc\": container with ID starting with 3a09b89cfa6f58872adaec4a35a3d3cd1dd0dc256900a0efe55e07d8f9199cdc not found: ID does not exist" Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.624815 4764 scope.go:117] "RemoveContainer" containerID="865c7b22e00481478eb7d002c1f40157c39f86e5b45add0664ca71524b2698e9" Mar 09 13:36:01 crc kubenswrapper[4764]: E0309 13:36:01.625477 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"865c7b22e00481478eb7d002c1f40157c39f86e5b45add0664ca71524b2698e9\": container with ID starting with 865c7b22e00481478eb7d002c1f40157c39f86e5b45add0664ca71524b2698e9 not found: ID does not exist" containerID="865c7b22e00481478eb7d002c1f40157c39f86e5b45add0664ca71524b2698e9" Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.625514 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"865c7b22e00481478eb7d002c1f40157c39f86e5b45add0664ca71524b2698e9"} err="failed to get container status \"865c7b22e00481478eb7d002c1f40157c39f86e5b45add0664ca71524b2698e9\": rpc error: code = NotFound desc = could not find container \"865c7b22e00481478eb7d002c1f40157c39f86e5b45add0664ca71524b2698e9\": container with ID starting with 865c7b22e00481478eb7d002c1f40157c39f86e5b45add0664ca71524b2698e9 not found: ID does not exist" Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.625536 4764 scope.go:117] "RemoveContainer" containerID="cd30553e3203601e67a4a200045a0ff445fed5ef9355544a81ecb96bd9dfbcc0" Mar 09 13:36:01 crc kubenswrapper[4764]: E0309 13:36:01.625855 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"cd30553e3203601e67a4a200045a0ff445fed5ef9355544a81ecb96bd9dfbcc0\": container with ID starting with cd30553e3203601e67a4a200045a0ff445fed5ef9355544a81ecb96bd9dfbcc0 not found: ID does not exist" containerID="cd30553e3203601e67a4a200045a0ff445fed5ef9355544a81ecb96bd9dfbcc0" Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.625878 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd30553e3203601e67a4a200045a0ff445fed5ef9355544a81ecb96bd9dfbcc0"} err="failed to get container status \"cd30553e3203601e67a4a200045a0ff445fed5ef9355544a81ecb96bd9dfbcc0\": rpc error: code = NotFound desc = could not find container \"cd30553e3203601e67a4a200045a0ff445fed5ef9355544a81ecb96bd9dfbcc0\": container with ID starting with cd30553e3203601e67a4a200045a0ff445fed5ef9355544a81ecb96bd9dfbcc0 not found: ID does not exist" Mar 09 13:36:02 crc kubenswrapper[4764]: I0309 13:36:02.561226 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551056-chr2q" event={"ID":"b5ceebdd-e9ad-472a-8806-f5b441ced89a","Type":"ContainerStarted","Data":"b34325cd4e2f6811cadb8554b8a6fb24248546f3c4182376f60b7c3268c9f6c7"} Mar 09 13:36:02 crc kubenswrapper[4764]: I0309 13:36:02.578858 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551056-chr2q" podStartSLOduration=1.129425532 podStartE2EDuration="2.578837346s" podCreationTimestamp="2026-03-09 13:36:00 +0000 UTC" firstStartedPulling="2026-03-09 13:36:00.811742487 +0000 UTC m=+916.061914395" lastFinishedPulling="2026-03-09 13:36:02.261154301 +0000 UTC m=+917.511326209" observedRunningTime="2026-03-09 13:36:02.575423014 +0000 UTC m=+917.825594922" watchObservedRunningTime="2026-03-09 13:36:02.578837346 +0000 UTC m=+917.829009254" Mar 09 13:36:03 crc kubenswrapper[4764]: I0309 13:36:03.567607 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a" path="/var/lib/kubelet/pods/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a/volumes" Mar 09 13:36:03 crc kubenswrapper[4764]: I0309 13:36:03.569376 4764 generic.go:334] "Generic (PLEG): container finished" podID="b5ceebdd-e9ad-472a-8806-f5b441ced89a" containerID="b34325cd4e2f6811cadb8554b8a6fb24248546f3c4182376f60b7c3268c9f6c7" exitCode=0 Mar 09 13:36:03 crc kubenswrapper[4764]: I0309 13:36:03.569408 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551056-chr2q" event={"ID":"b5ceebdd-e9ad-472a-8806-f5b441ced89a","Type":"ContainerDied","Data":"b34325cd4e2f6811cadb8554b8a6fb24248546f3c4182376f60b7c3268c9f6c7"} Mar 09 13:36:04 crc kubenswrapper[4764]: I0309 13:36:04.896096 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551056-chr2q" Mar 09 13:36:05 crc kubenswrapper[4764]: I0309 13:36:05.005743 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6dl4\" (UniqueName: \"kubernetes.io/projected/b5ceebdd-e9ad-472a-8806-f5b441ced89a-kube-api-access-h6dl4\") pod \"b5ceebdd-e9ad-472a-8806-f5b441ced89a\" (UID: \"b5ceebdd-e9ad-472a-8806-f5b441ced89a\") " Mar 09 13:36:05 crc kubenswrapper[4764]: I0309 13:36:05.014845 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5ceebdd-e9ad-472a-8806-f5b441ced89a-kube-api-access-h6dl4" (OuterVolumeSpecName: "kube-api-access-h6dl4") pod "b5ceebdd-e9ad-472a-8806-f5b441ced89a" (UID: "b5ceebdd-e9ad-472a-8806-f5b441ced89a"). InnerVolumeSpecName "kube-api-access-h6dl4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:36:05 crc kubenswrapper[4764]: I0309 13:36:05.107927 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6dl4\" (UniqueName: \"kubernetes.io/projected/b5ceebdd-e9ad-472a-8806-f5b441ced89a-kube-api-access-h6dl4\") on node \"crc\" DevicePath \"\"" Mar 09 13:36:05 crc kubenswrapper[4764]: I0309 13:36:05.592228 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551056-chr2q" event={"ID":"b5ceebdd-e9ad-472a-8806-f5b441ced89a","Type":"ContainerDied","Data":"25f6a3f57e2ed3f49a95fdf9ce89a920cef1531b81bc2f3426cf66871e5a53ad"} Mar 09 13:36:05 crc kubenswrapper[4764]: I0309 13:36:05.592293 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25f6a3f57e2ed3f49a95fdf9ce89a920cef1531b81bc2f3426cf66871e5a53ad" Mar 09 13:36:05 crc kubenswrapper[4764]: I0309 13:36:05.592303 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551056-chr2q" Mar 09 13:36:05 crc kubenswrapper[4764]: I0309 13:36:05.641282 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551050-zmlhn"] Mar 09 13:36:05 crc kubenswrapper[4764]: I0309 13:36:05.645498 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551050-zmlhn"] Mar 09 13:36:07 crc kubenswrapper[4764]: I0309 13:36:07.570846 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f815cd5-462f-4994-bab1-beef4157b06e" path="/var/lib/kubelet/pods/7f815cd5-462f-4994-bab1-beef4157b06e/volumes" Mar 09 13:36:11 crc kubenswrapper[4764]: I0309 13:36:11.551436 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv" Mar 09 13:36:31 crc kubenswrapper[4764]: I0309 13:36:31.256911 4764 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-659b995f59-8s255" Mar 09 13:36:31 crc kubenswrapper[4764]: I0309 13:36:31.911801 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-kl47c"] Mar 09 13:36:31 crc kubenswrapper[4764]: E0309 13:36:31.912528 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a" containerName="extract-utilities" Mar 09 13:36:31 crc kubenswrapper[4764]: I0309 13:36:31.912554 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a" containerName="extract-utilities" Mar 09 13:36:31 crc kubenswrapper[4764]: E0309 13:36:31.912569 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a" containerName="registry-server" Mar 09 13:36:31 crc kubenswrapper[4764]: I0309 13:36:31.912576 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a" containerName="registry-server" Mar 09 13:36:31 crc kubenswrapper[4764]: E0309 13:36:31.912583 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5ceebdd-e9ad-472a-8806-f5b441ced89a" containerName="oc" Mar 09 13:36:31 crc kubenswrapper[4764]: I0309 13:36:31.912592 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5ceebdd-e9ad-472a-8806-f5b441ced89a" containerName="oc" Mar 09 13:36:31 crc kubenswrapper[4764]: E0309 13:36:31.912612 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a" containerName="extract-content" Mar 09 13:36:31 crc kubenswrapper[4764]: I0309 13:36:31.912620 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a" containerName="extract-content" Mar 09 13:36:31 crc kubenswrapper[4764]: I0309 13:36:31.912749 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5ceebdd-e9ad-472a-8806-f5b441ced89a" 
containerName="oc" Mar 09 13:36:31 crc kubenswrapper[4764]: I0309 13:36:31.912765 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a" containerName="registry-server" Mar 09 13:36:31 crc kubenswrapper[4764]: I0309 13:36:31.915154 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-kl47c" Mar 09 13:36:31 crc kubenswrapper[4764]: I0309 13:36:31.918994 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 09 13:36:31 crc kubenswrapper[4764]: I0309 13:36:31.919316 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-nw2wr" Mar 09 13:36:31 crc kubenswrapper[4764]: I0309 13:36:31.919331 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-wqd8z"] Mar 09 13:36:31 crc kubenswrapper[4764]: I0309 13:36:31.920077 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 09 13:36:31 crc kubenswrapper[4764]: I0309 13:36:31.920533 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-wqd8z"
Mar 09 13:36:31 crc kubenswrapper[4764]: I0309 13:36:31.923149 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Mar 09 13:36:31 crc kubenswrapper[4764]: I0309 13:36:31.943061 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-wqd8z"]
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.020138 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfznq\" (UniqueName: \"kubernetes.io/projected/72efa175-2568-4c62-a97e-35893887fe82-kube-api-access-bfznq\") pod \"frr-k8s-webhook-server-7f989f654f-wqd8z\" (UID: \"72efa175-2568-4c62-a97e-35893887fe82\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-wqd8z"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.020227 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/9333a95c-85e4-4e7d-a142-ae2dd06b4146-frr-conf\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.020254 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9333a95c-85e4-4e7d-a142-ae2dd06b4146-metrics-certs\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.020271 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/9333a95c-85e4-4e7d-a142-ae2dd06b4146-frr-sockets\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.020292 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/9333a95c-85e4-4e7d-a142-ae2dd06b4146-metrics\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.020315 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72efa175-2568-4c62-a97e-35893887fe82-cert\") pod \"frr-k8s-webhook-server-7f989f654f-wqd8z\" (UID: \"72efa175-2568-4c62-a97e-35893887fe82\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-wqd8z"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.020339 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/9333a95c-85e4-4e7d-a142-ae2dd06b4146-reloader\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.020383 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/9333a95c-85e4-4e7d-a142-ae2dd06b4146-frr-startup\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.020404 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df4c8\" (UniqueName: \"kubernetes.io/projected/9333a95c-85e4-4e7d-a142-ae2dd06b4146-kube-api-access-df4c8\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.036929 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-2z5wp"]
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.038356 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-2z5wp"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.041494 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.041876 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.042018 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.042303 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-9bjvr"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.069186 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-lgrkv"]
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.070312 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-lgrkv"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.073143 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.081956 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-lgrkv"]
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.121760 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfznq\" (UniqueName: \"kubernetes.io/projected/72efa175-2568-4c62-a97e-35893887fe82-kube-api-access-bfznq\") pod \"frr-k8s-webhook-server-7f989f654f-wqd8z\" (UID: \"72efa175-2568-4c62-a97e-35893887fe82\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-wqd8z"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.121832 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bfd899d4-a0df-47e3-aa36-1cf690235c45-memberlist\") pod \"speaker-2z5wp\" (UID: \"bfd899d4-a0df-47e3-aa36-1cf690235c45\") " pod="metallb-system/speaker-2z5wp"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.121894 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/9333a95c-85e4-4e7d-a142-ae2dd06b4146-frr-conf\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.121912 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9333a95c-85e4-4e7d-a142-ae2dd06b4146-metrics-certs\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.121935 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/9333a95c-85e4-4e7d-a142-ae2dd06b4146-frr-sockets\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.121953 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/9333a95c-85e4-4e7d-a142-ae2dd06b4146-metrics\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.121971 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72efa175-2568-4c62-a97e-35893887fe82-cert\") pod \"frr-k8s-webhook-server-7f989f654f-wqd8z\" (UID: \"72efa175-2568-4c62-a97e-35893887fe82\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-wqd8z"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.121995 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/9333a95c-85e4-4e7d-a142-ae2dd06b4146-reloader\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.122031 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcrdv\" (UniqueName: \"kubernetes.io/projected/bfd899d4-a0df-47e3-aa36-1cf690235c45-kube-api-access-rcrdv\") pod \"speaker-2z5wp\" (UID: \"bfd899d4-a0df-47e3-aa36-1cf690235c45\") " pod="metallb-system/speaker-2z5wp"
Mar 09 13:36:32 crc kubenswrapper[4764]: E0309 13:36:32.122050 4764 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found
Mar 09 13:36:32 crc kubenswrapper[4764]: E0309 13:36:32.122129 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9333a95c-85e4-4e7d-a142-ae2dd06b4146-metrics-certs podName:9333a95c-85e4-4e7d-a142-ae2dd06b4146 nodeName:}" failed. No retries permitted until 2026-03-09 13:36:32.62210324 +0000 UTC m=+947.872275148 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9333a95c-85e4-4e7d-a142-ae2dd06b4146-metrics-certs") pod "frr-k8s-kl47c" (UID: "9333a95c-85e4-4e7d-a142-ae2dd06b4146") : secret "frr-k8s-certs-secret" not found
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.122054 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/bfd899d4-a0df-47e3-aa36-1cf690235c45-metallb-excludel2\") pod \"speaker-2z5wp\" (UID: \"bfd899d4-a0df-47e3-aa36-1cf690235c45\") " pod="metallb-system/speaker-2z5wp"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.122568 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/9333a95c-85e4-4e7d-a142-ae2dd06b4146-frr-startup\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.122688 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df4c8\" (UniqueName: \"kubernetes.io/projected/9333a95c-85e4-4e7d-a142-ae2dd06b4146-kube-api-access-df4c8\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.122811 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfd899d4-a0df-47e3-aa36-1cf690235c45-metrics-certs\") pod \"speaker-2z5wp\" (UID: \"bfd899d4-a0df-47e3-aa36-1cf690235c45\") " pod="metallb-system/speaker-2z5wp"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.122812 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/9333a95c-85e4-4e7d-a142-ae2dd06b4146-frr-conf\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.123109 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/9333a95c-85e4-4e7d-a142-ae2dd06b4146-frr-sockets\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.123151 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/9333a95c-85e4-4e7d-a142-ae2dd06b4146-reloader\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.123276 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/9333a95c-85e4-4e7d-a142-ae2dd06b4146-metrics\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.123810 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/9333a95c-85e4-4e7d-a142-ae2dd06b4146-frr-startup\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.130037 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72efa175-2568-4c62-a97e-35893887fe82-cert\") pod \"frr-k8s-webhook-server-7f989f654f-wqd8z\" (UID: \"72efa175-2568-4c62-a97e-35893887fe82\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-wqd8z"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.142967 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df4c8\" (UniqueName: \"kubernetes.io/projected/9333a95c-85e4-4e7d-a142-ae2dd06b4146-kube-api-access-df4c8\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.156037 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfznq\" (UniqueName: \"kubernetes.io/projected/72efa175-2568-4c62-a97e-35893887fe82-kube-api-access-bfznq\") pod \"frr-k8s-webhook-server-7f989f654f-wqd8z\" (UID: \"72efa175-2568-4c62-a97e-35893887fe82\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-wqd8z"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.224508 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcrdv\" (UniqueName: \"kubernetes.io/projected/bfd899d4-a0df-47e3-aa36-1cf690235c45-kube-api-access-rcrdv\") pod \"speaker-2z5wp\" (UID: \"bfd899d4-a0df-47e3-aa36-1cf690235c45\") " pod="metallb-system/speaker-2z5wp"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.224595 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/709e786e-5c7d-45d3-ac38-78351dfbec81-metrics-certs\") pod \"controller-86ddb6bd46-lgrkv\" (UID: \"709e786e-5c7d-45d3-ac38-78351dfbec81\") " pod="metallb-system/controller-86ddb6bd46-lgrkv"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.224683 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/bfd899d4-a0df-47e3-aa36-1cf690235c45-metallb-excludel2\") pod \"speaker-2z5wp\" (UID: \"bfd899d4-a0df-47e3-aa36-1cf690235c45\") " pod="metallb-system/speaker-2z5wp"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.224740 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ft42\" (UniqueName: \"kubernetes.io/projected/709e786e-5c7d-45d3-ac38-78351dfbec81-kube-api-access-2ft42\") pod \"controller-86ddb6bd46-lgrkv\" (UID: \"709e786e-5c7d-45d3-ac38-78351dfbec81\") " pod="metallb-system/controller-86ddb6bd46-lgrkv"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.224773 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfd899d4-a0df-47e3-aa36-1cf690235c45-metrics-certs\") pod \"speaker-2z5wp\" (UID: \"bfd899d4-a0df-47e3-aa36-1cf690235c45\") " pod="metallb-system/speaker-2z5wp"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.224818 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bfd899d4-a0df-47e3-aa36-1cf690235c45-memberlist\") pod \"speaker-2z5wp\" (UID: \"bfd899d4-a0df-47e3-aa36-1cf690235c45\") " pod="metallb-system/speaker-2z5wp"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.224852 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/709e786e-5c7d-45d3-ac38-78351dfbec81-cert\") pod \"controller-86ddb6bd46-lgrkv\" (UID: \"709e786e-5c7d-45d3-ac38-78351dfbec81\") " pod="metallb-system/controller-86ddb6bd46-lgrkv"
Mar 09 13:36:32 crc kubenswrapper[4764]: E0309 13:36:32.225036 4764 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found
Mar 09 13:36:32 crc kubenswrapper[4764]: E0309 13:36:32.225097 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfd899d4-a0df-47e3-aa36-1cf690235c45-metrics-certs podName:bfd899d4-a0df-47e3-aa36-1cf690235c45 nodeName:}" failed. No retries permitted until 2026-03-09 13:36:32.725076354 +0000 UTC m=+947.975248272 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bfd899d4-a0df-47e3-aa36-1cf690235c45-metrics-certs") pod "speaker-2z5wp" (UID: "bfd899d4-a0df-47e3-aa36-1cf690235c45") : secret "speaker-certs-secret" not found
Mar 09 13:36:32 crc kubenswrapper[4764]: E0309 13:36:32.225407 4764 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 09 13:36:32 crc kubenswrapper[4764]: E0309 13:36:32.225455 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfd899d4-a0df-47e3-aa36-1cf690235c45-memberlist podName:bfd899d4-a0df-47e3-aa36-1cf690235c45 nodeName:}" failed. No retries permitted until 2026-03-09 13:36:32.725445213 +0000 UTC m=+947.975617141 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/bfd899d4-a0df-47e3-aa36-1cf690235c45-memberlist") pod "speaker-2z5wp" (UID: "bfd899d4-a0df-47e3-aa36-1cf690235c45") : secret "metallb-memberlist" not found
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.225594 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/bfd899d4-a0df-47e3-aa36-1cf690235c45-metallb-excludel2\") pod \"speaker-2z5wp\" (UID: \"bfd899d4-a0df-47e3-aa36-1cf690235c45\") " pod="metallb-system/speaker-2z5wp"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.242812 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-wqd8z"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.247251 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcrdv\" (UniqueName: \"kubernetes.io/projected/bfd899d4-a0df-47e3-aa36-1cf690235c45-kube-api-access-rcrdv\") pod \"speaker-2z5wp\" (UID: \"bfd899d4-a0df-47e3-aa36-1cf690235c45\") " pod="metallb-system/speaker-2z5wp"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.326541 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/709e786e-5c7d-45d3-ac38-78351dfbec81-metrics-certs\") pod \"controller-86ddb6bd46-lgrkv\" (UID: \"709e786e-5c7d-45d3-ac38-78351dfbec81\") " pod="metallb-system/controller-86ddb6bd46-lgrkv"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.326628 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ft42\" (UniqueName: \"kubernetes.io/projected/709e786e-5c7d-45d3-ac38-78351dfbec81-kube-api-access-2ft42\") pod \"controller-86ddb6bd46-lgrkv\" (UID: \"709e786e-5c7d-45d3-ac38-78351dfbec81\") " pod="metallb-system/controller-86ddb6bd46-lgrkv"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.326734 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/709e786e-5c7d-45d3-ac38-78351dfbec81-cert\") pod \"controller-86ddb6bd46-lgrkv\" (UID: \"709e786e-5c7d-45d3-ac38-78351dfbec81\") " pod="metallb-system/controller-86ddb6bd46-lgrkv"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.333908 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/709e786e-5c7d-45d3-ac38-78351dfbec81-metrics-certs\") pod \"controller-86ddb6bd46-lgrkv\" (UID: \"709e786e-5c7d-45d3-ac38-78351dfbec81\") " pod="metallb-system/controller-86ddb6bd46-lgrkv"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.334716 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/709e786e-5c7d-45d3-ac38-78351dfbec81-cert\") pod \"controller-86ddb6bd46-lgrkv\" (UID: \"709e786e-5c7d-45d3-ac38-78351dfbec81\") " pod="metallb-system/controller-86ddb6bd46-lgrkv"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.352222 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ft42\" (UniqueName: \"kubernetes.io/projected/709e786e-5c7d-45d3-ac38-78351dfbec81-kube-api-access-2ft42\") pod \"controller-86ddb6bd46-lgrkv\" (UID: \"709e786e-5c7d-45d3-ac38-78351dfbec81\") " pod="metallb-system/controller-86ddb6bd46-lgrkv"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.385950 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-lgrkv"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.488714 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-wqd8z"]
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.604417 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-lgrkv"]
Mar 09 13:36:32 crc kubenswrapper[4764]: W0309 13:36:32.609410 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod709e786e_5c7d_45d3_ac38_78351dfbec81.slice/crio-ccc82aaefed7d9537262982fd7a996604ba097e5041bb95579a4d031f9d2e60c WatchSource:0}: Error finding container ccc82aaefed7d9537262982fd7a996604ba097e5041bb95579a4d031f9d2e60c: Status 404 returned error can't find the container with id ccc82aaefed7d9537262982fd7a996604ba097e5041bb95579a4d031f9d2e60c
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.631879 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9333a95c-85e4-4e7d-a142-ae2dd06b4146-metrics-certs\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.635024 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9333a95c-85e4-4e7d-a142-ae2dd06b4146-metrics-certs\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.733146 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfd899d4-a0df-47e3-aa36-1cf690235c45-metrics-certs\") pod \"speaker-2z5wp\" (UID: \"bfd899d4-a0df-47e3-aa36-1cf690235c45\") " pod="metallb-system/speaker-2z5wp"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.733830 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bfd899d4-a0df-47e3-aa36-1cf690235c45-memberlist\") pod \"speaker-2z5wp\" (UID: \"bfd899d4-a0df-47e3-aa36-1cf690235c45\") " pod="metallb-system/speaker-2z5wp"
Mar 09 13:36:32 crc kubenswrapper[4764]: E0309 13:36:32.734033 4764 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 09 13:36:32 crc kubenswrapper[4764]: E0309 13:36:32.734103 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfd899d4-a0df-47e3-aa36-1cf690235c45-memberlist podName:bfd899d4-a0df-47e3-aa36-1cf690235c45 nodeName:}" failed. No retries permitted until 2026-03-09 13:36:33.734078397 +0000 UTC m=+948.984250315 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/bfd899d4-a0df-47e3-aa36-1cf690235c45-memberlist") pod "speaker-2z5wp" (UID: "bfd899d4-a0df-47e3-aa36-1cf690235c45") : secret "metallb-memberlist" not found
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.738931 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfd899d4-a0df-47e3-aa36-1cf690235c45-metrics-certs\") pod \"speaker-2z5wp\" (UID: \"bfd899d4-a0df-47e3-aa36-1cf690235c45\") " pod="metallb-system/speaker-2z5wp"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.790801 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-lgrkv" event={"ID":"709e786e-5c7d-45d3-ac38-78351dfbec81","Type":"ContainerStarted","Data":"c645dbcc54ac7389adcf8472029f28a17bb3ab18d62f651f44a296f7c941a029"}
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.790863 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-lgrkv" event={"ID":"709e786e-5c7d-45d3-ac38-78351dfbec81","Type":"ContainerStarted","Data":"ccc82aaefed7d9537262982fd7a996604ba097e5041bb95579a4d031f9d2e60c"}
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.792624 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-wqd8z" event={"ID":"72efa175-2568-4c62-a97e-35893887fe82","Type":"ContainerStarted","Data":"8fbfb9c4ea8c03f5d5d51c68d861b19b4b94f6d21190cf8346cf7708bfaff7c5"}
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.835445 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:33 crc kubenswrapper[4764]: I0309 13:36:33.749576 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bfd899d4-a0df-47e3-aa36-1cf690235c45-memberlist\") pod \"speaker-2z5wp\" (UID: \"bfd899d4-a0df-47e3-aa36-1cf690235c45\") " pod="metallb-system/speaker-2z5wp"
Mar 09 13:36:33 crc kubenswrapper[4764]: I0309 13:36:33.760156 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bfd899d4-a0df-47e3-aa36-1cf690235c45-memberlist\") pod \"speaker-2z5wp\" (UID: \"bfd899d4-a0df-47e3-aa36-1cf690235c45\") " pod="metallb-system/speaker-2z5wp"
Mar 09 13:36:33 crc kubenswrapper[4764]: I0309 13:36:33.810378 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-lgrkv" event={"ID":"709e786e-5c7d-45d3-ac38-78351dfbec81","Type":"ContainerStarted","Data":"44e0d7fccc074ab18f2a9f368ba3458681486755c84a6061fa7b346b373675d1"}
Mar 09 13:36:33 crc kubenswrapper[4764]: I0309 13:36:33.810552 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-lgrkv"
Mar 09 13:36:33 crc kubenswrapper[4764]: I0309 13:36:33.812663 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kl47c" event={"ID":"9333a95c-85e4-4e7d-a142-ae2dd06b4146","Type":"ContainerStarted","Data":"a9b94eb5591c50ab81ba457deee7f7ff9c242cb15790807a3ea843d1c7e9fe45"}
Mar 09 13:36:33 crc kubenswrapper[4764]: I0309 13:36:33.831013 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-lgrkv" podStartSLOduration=1.8309633619999999 podStartE2EDuration="1.830963362s" podCreationTimestamp="2026-03-09 13:36:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:36:33.826660327 +0000 UTC m=+949.076832255" watchObservedRunningTime="2026-03-09 13:36:33.830963362 +0000 UTC m=+949.081135290"
Mar 09 13:36:33 crc kubenswrapper[4764]: I0309 13:36:33.853041 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-2z5wp"
Mar 09 13:36:33 crc kubenswrapper[4764]: W0309 13:36:33.875755 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfd899d4_a0df_47e3_aa36_1cf690235c45.slice/crio-9b795bda24de764cb038a0f7e99fc5d97e3d1a330e3d8bf387e47f7d332912eb WatchSource:0}: Error finding container 9b795bda24de764cb038a0f7e99fc5d97e3d1a330e3d8bf387e47f7d332912eb: Status 404 returned error can't find the container with id 9b795bda24de764cb038a0f7e99fc5d97e3d1a330e3d8bf387e47f7d332912eb
Mar 09 13:36:34 crc kubenswrapper[4764]: I0309 13:36:34.829781 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-2z5wp" event={"ID":"bfd899d4-a0df-47e3-aa36-1cf690235c45","Type":"ContainerStarted","Data":"52a4d032a3a07513b9b75a90a480c9aa3bd91bfb06613d9f966c934e9181de0c"}
Mar 09 13:36:34 crc kubenswrapper[4764]: I0309 13:36:34.830257 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-2z5wp" event={"ID":"bfd899d4-a0df-47e3-aa36-1cf690235c45","Type":"ContainerStarted","Data":"6683eb40b5aec2a052fbe30366a057a1a2a74e3121f4a15d3bd3de9cec82bd1d"}
Mar 09 13:36:34 crc kubenswrapper[4764]: I0309 13:36:34.830275 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-2z5wp" event={"ID":"bfd899d4-a0df-47e3-aa36-1cf690235c45","Type":"ContainerStarted","Data":"9b795bda24de764cb038a0f7e99fc5d97e3d1a330e3d8bf387e47f7d332912eb"}
Mar 09 13:36:34 crc kubenswrapper[4764]: I0309 13:36:34.830512 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-2z5wp"
Mar 09 13:36:34 crc kubenswrapper[4764]: I0309 13:36:34.866875 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-2z5wp" podStartSLOduration=2.866849395 podStartE2EDuration="2.866849395s" podCreationTimestamp="2026-03-09 13:36:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:36:34.865717935 +0000 UTC m=+950.115889863" watchObservedRunningTime="2026-03-09 13:36:34.866849395 +0000 UTC m=+950.117021303"
Mar 09 13:36:40 crc kubenswrapper[4764]: I0309 13:36:40.871471 4764 generic.go:334] "Generic (PLEG): container finished" podID="9333a95c-85e4-4e7d-a142-ae2dd06b4146" containerID="470655eecd68eac5ce56d927a5337c5a83b695a3960176753ea38b8da26f138a" exitCode=0
Mar 09 13:36:40 crc kubenswrapper[4764]: I0309 13:36:40.871543 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kl47c" event={"ID":"9333a95c-85e4-4e7d-a142-ae2dd06b4146","Type":"ContainerDied","Data":"470655eecd68eac5ce56d927a5337c5a83b695a3960176753ea38b8da26f138a"}
Mar 09 13:36:40 crc kubenswrapper[4764]: I0309 13:36:40.873433 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-wqd8z" event={"ID":"72efa175-2568-4c62-a97e-35893887fe82","Type":"ContainerStarted","Data":"7a46c71e01e80dee907d49b4b2478287a8a1bc104aef22126e89bb77e6c8bd91"}
Mar 09 13:36:40 crc kubenswrapper[4764]: I0309 13:36:40.873610 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-wqd8z"
Mar 09 13:36:40 crc kubenswrapper[4764]: I0309 13:36:40.914271 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-wqd8z" podStartSLOduration=2.206412149 podStartE2EDuration="9.914248209s" podCreationTimestamp="2026-03-09 13:36:31 +0000 UTC" firstStartedPulling="2026-03-09 13:36:32.503258543 +0000 UTC m=+947.753430451" lastFinishedPulling="2026-03-09 13:36:40.211094603 +0000 UTC m=+955.461266511" observedRunningTime="2026-03-09 13:36:40.911160286 +0000 UTC m=+956.161332194" watchObservedRunningTime="2026-03-09 13:36:40.914248209 +0000 UTC m=+956.164420117"
Mar 09 13:36:41 crc kubenswrapper[4764]: I0309 13:36:41.881382 4764 generic.go:334] "Generic (PLEG): container finished" podID="9333a95c-85e4-4e7d-a142-ae2dd06b4146" containerID="1f7f45e7224027f135e01339375692e4ab6c75e79fc75dad448b13ac4973a932" exitCode=0
Mar 09 13:36:41 crc kubenswrapper[4764]: I0309 13:36:41.881495 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kl47c" event={"ID":"9333a95c-85e4-4e7d-a142-ae2dd06b4146","Type":"ContainerDied","Data":"1f7f45e7224027f135e01339375692e4ab6c75e79fc75dad448b13ac4973a932"}
Mar 09 13:36:42 crc kubenswrapper[4764]: I0309 13:36:42.390276 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-lgrkv"
Mar 09 13:36:42 crc kubenswrapper[4764]: I0309 13:36:42.890705 4764 generic.go:334] "Generic (PLEG): container finished" podID="9333a95c-85e4-4e7d-a142-ae2dd06b4146" containerID="b307d1fe6f83787aea879f9ff45eed41f0d807cb5616664dc7891dea7e3ed6a1" exitCode=0
Mar 09 13:36:42 crc kubenswrapper[4764]: I0309 13:36:42.890950 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kl47c" event={"ID":"9333a95c-85e4-4e7d-a142-ae2dd06b4146","Type":"ContainerDied","Data":"b307d1fe6f83787aea879f9ff45eed41f0d807cb5616664dc7891dea7e3ed6a1"}
Mar 09 13:36:43 crc kubenswrapper[4764]: I0309 13:36:43.902344 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kl47c" event={"ID":"9333a95c-85e4-4e7d-a142-ae2dd06b4146","Type":"ContainerStarted","Data":"c0cc25911be6ab5a281fa081a719db07070f1612173f89156a7722670d4f38ce"}
Mar 09 13:36:43 crc kubenswrapper[4764]: I0309 13:36:43.903879 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:43 crc kubenswrapper[4764]: I0309 13:36:43.903899 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kl47c" event={"ID":"9333a95c-85e4-4e7d-a142-ae2dd06b4146","Type":"ContainerStarted","Data":"b875b550b788ce5f1a7ea11015e8c25e08dd7c9085f74715b0514b52c8fee9cf"}
Mar 09 13:36:43 crc kubenswrapper[4764]: I0309 13:36:43.903913 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kl47c" event={"ID":"9333a95c-85e4-4e7d-a142-ae2dd06b4146","Type":"ContainerStarted","Data":"1f113a3f9f07f101fcae5bf2331396d152db4190e0d459527807589797ae1746"}
Mar 09 13:36:43 crc kubenswrapper[4764]: I0309 13:36:43.903924 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kl47c" event={"ID":"9333a95c-85e4-4e7d-a142-ae2dd06b4146","Type":"ContainerStarted","Data":"2c65403ce820a236bd04146940e6cbd59f0b07a8a0e0eb9baa6635134e2b1c11"}
Mar 09 13:36:43 crc kubenswrapper[4764]: I0309 13:36:43.903936 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kl47c" event={"ID":"9333a95c-85e4-4e7d-a142-ae2dd06b4146","Type":"ContainerStarted","Data":"c126faa9bee79fddc335b737e5b6ffe374c670cbc4f2a2fd41d798cbdef516b5"}
Mar 09 13:36:43 crc kubenswrapper[4764]: I0309 13:36:43.903946 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kl47c" event={"ID":"9333a95c-85e4-4e7d-a142-ae2dd06b4146","Type":"ContainerStarted","Data":"42cff07c48e09d3c676271f26a1cdb8a4f5bfa450eeae8871a4ecb0db96dc764"}
Mar 09 13:36:43 crc kubenswrapper[4764]: I0309 13:36:43.924576 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-kl47c" podStartSLOduration=5.712443285 podStartE2EDuration="12.924557917s" podCreationTimestamp="2026-03-09 13:36:31 +0000 UTC" firstStartedPulling="2026-03-09 13:36:33.021974486 +0000 UTC m=+948.272146394" lastFinishedPulling="2026-03-09 13:36:40.234089118 +0000 UTC m=+955.484261026" observedRunningTime="2026-03-09 13:36:43.922023059 +0000 UTC m=+959.172194967" watchObservedRunningTime="2026-03-09 13:36:43.924557917 +0000 UTC m=+959.174729825"
Mar 09 13:36:47 crc kubenswrapper[4764]: I0309 13:36:47.835814 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:47 crc kubenswrapper[4764]: I0309 13:36:47.875277 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:49 crc kubenswrapper[4764]: I0309 13:36:49.620885 4764 scope.go:117] "RemoveContainer" containerID="a7ac41644a3901488ef405782554d6dd08becac720d51b607ff6a4cba78e912f"
Mar 09 13:36:52 crc kubenswrapper[4764]: I0309 13:36:52.248205 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-wqd8z"
Mar 09 13:36:53 crc kubenswrapper[4764]: I0309 13:36:53.857352 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-2z5wp"
Mar 09 13:36:56 crc kubenswrapper[4764]: I0309 13:36:56.420598 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-lnxvv"]
Mar 09 13:36:56 crc kubenswrapper[4764]: I0309 13:36:56.422375 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-lnxvv"
Mar 09 13:36:56 crc kubenswrapper[4764]: I0309 13:36:56.440709 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Mar 09 13:36:56 crc kubenswrapper[4764]: I0309 13:36:56.440718 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Mar 09 13:36:56 crc kubenswrapper[4764]: I0309 13:36:56.440722 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-6c56h"
Mar 09 13:36:56 crc kubenswrapper[4764]: I0309 13:36:56.444126 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-lnxvv"]
Mar 09 13:36:56 crc kubenswrapper[4764]: I0309 13:36:56.511635 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9v92\" (UniqueName: \"kubernetes.io/projected/15e66009-37fa-4f89-aba2-e39f68c46496-kube-api-access-g9v92\") pod \"openstack-operator-index-lnxvv\" (UID: \"15e66009-37fa-4f89-aba2-e39f68c46496\") " pod="openstack-operators/openstack-operator-index-lnxvv"
Mar 09 13:36:56 crc kubenswrapper[4764]: I0309 13:36:56.613173 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9v92\" (UniqueName: \"kubernetes.io/projected/15e66009-37fa-4f89-aba2-e39f68c46496-kube-api-access-g9v92\") pod \"openstack-operator-index-lnxvv\" (UID: \"15e66009-37fa-4f89-aba2-e39f68c46496\") " pod="openstack-operators/openstack-operator-index-lnxvv"
Mar 09 13:36:56 crc kubenswrapper[4764]: I0309 13:36:56.633014 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9v92\" (UniqueName: \"kubernetes.io/projected/15e66009-37fa-4f89-aba2-e39f68c46496-kube-api-access-g9v92\") pod \"openstack-operator-index-lnxvv\" (UID: 
\"15e66009-37fa-4f89-aba2-e39f68c46496\") " pod="openstack-operators/openstack-operator-index-lnxvv" Mar 09 13:36:56 crc kubenswrapper[4764]: I0309 13:36:56.741732 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-lnxvv" Mar 09 13:36:57 crc kubenswrapper[4764]: I0309 13:36:57.163450 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-lnxvv"] Mar 09 13:36:57 crc kubenswrapper[4764]: W0309 13:36:57.172382 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15e66009_37fa_4f89_aba2_e39f68c46496.slice/crio-357d479dcf0b97e2bac8502d01fb6d3b44d98edba00f3d6869efb668be985ee8 WatchSource:0}: Error finding container 357d479dcf0b97e2bac8502d01fb6d3b44d98edba00f3d6869efb668be985ee8: Status 404 returned error can't find the container with id 357d479dcf0b97e2bac8502d01fb6d3b44d98edba00f3d6869efb668be985ee8 Mar 09 13:36:58 crc kubenswrapper[4764]: I0309 13:36:58.000175 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lnxvv" event={"ID":"15e66009-37fa-4f89-aba2-e39f68c46496","Type":"ContainerStarted","Data":"357d479dcf0b97e2bac8502d01fb6d3b44d98edba00f3d6869efb668be985ee8"} Mar 09 13:36:58 crc kubenswrapper[4764]: I0309 13:36:58.371957 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:36:58 crc kubenswrapper[4764]: I0309 13:36:58.372091 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:36:59 crc kubenswrapper[4764]: I0309 13:36:59.802749 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-lnxvv"] Mar 09 13:37:00 crc kubenswrapper[4764]: I0309 13:37:00.427717 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-lvrg9"] Mar 09 13:37:00 crc kubenswrapper[4764]: I0309 13:37:00.428499 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-lvrg9" Mar 09 13:37:00 crc kubenswrapper[4764]: I0309 13:37:00.446227 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-lvrg9"] Mar 09 13:37:00 crc kubenswrapper[4764]: I0309 13:37:00.571747 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxv8l\" (UniqueName: \"kubernetes.io/projected/8e6c087a-8aaa-427c-822b-a274e19cc440-kube-api-access-gxv8l\") pod \"openstack-operator-index-lvrg9\" (UID: \"8e6c087a-8aaa-427c-822b-a274e19cc440\") " pod="openstack-operators/openstack-operator-index-lvrg9" Mar 09 13:37:00 crc kubenswrapper[4764]: I0309 13:37:00.672772 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxv8l\" (UniqueName: \"kubernetes.io/projected/8e6c087a-8aaa-427c-822b-a274e19cc440-kube-api-access-gxv8l\") pod \"openstack-operator-index-lvrg9\" (UID: \"8e6c087a-8aaa-427c-822b-a274e19cc440\") " pod="openstack-operators/openstack-operator-index-lvrg9" Mar 09 13:37:00 crc kubenswrapper[4764]: I0309 13:37:00.693737 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxv8l\" (UniqueName: \"kubernetes.io/projected/8e6c087a-8aaa-427c-822b-a274e19cc440-kube-api-access-gxv8l\") pod \"openstack-operator-index-lvrg9\" (UID: 
\"8e6c087a-8aaa-427c-822b-a274e19cc440\") " pod="openstack-operators/openstack-operator-index-lvrg9" Mar 09 13:37:00 crc kubenswrapper[4764]: I0309 13:37:00.813194 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-lvrg9" Mar 09 13:37:01 crc kubenswrapper[4764]: I0309 13:37:01.026751 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lnxvv" event={"ID":"15e66009-37fa-4f89-aba2-e39f68c46496","Type":"ContainerStarted","Data":"1dc0bce8d8df10f300fba641a99cc5d62b54813faee2b39fbfc95ac2058ac6a6"} Mar 09 13:37:01 crc kubenswrapper[4764]: I0309 13:37:01.026918 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-lnxvv" podUID="15e66009-37fa-4f89-aba2-e39f68c46496" containerName="registry-server" containerID="cri-o://1dc0bce8d8df10f300fba641a99cc5d62b54813faee2b39fbfc95ac2058ac6a6" gracePeriod=2 Mar 09 13:37:01 crc kubenswrapper[4764]: I0309 13:37:01.053048 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-lnxvv" podStartSLOduration=1.572860859 podStartE2EDuration="5.053022653s" podCreationTimestamp="2026-03-09 13:36:56 +0000 UTC" firstStartedPulling="2026-03-09 13:36:57.174733332 +0000 UTC m=+972.424905230" lastFinishedPulling="2026-03-09 13:37:00.654895116 +0000 UTC m=+975.905067024" observedRunningTime="2026-03-09 13:37:01.045719448 +0000 UTC m=+976.295891366" watchObservedRunningTime="2026-03-09 13:37:01.053022653 +0000 UTC m=+976.303194561" Mar 09 13:37:01 crc kubenswrapper[4764]: I0309 13:37:01.252609 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-lvrg9"] Mar 09 13:37:01 crc kubenswrapper[4764]: I0309 13:37:01.432473 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-lnxvv" Mar 09 13:37:01 crc kubenswrapper[4764]: I0309 13:37:01.589020 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9v92\" (UniqueName: \"kubernetes.io/projected/15e66009-37fa-4f89-aba2-e39f68c46496-kube-api-access-g9v92\") pod \"15e66009-37fa-4f89-aba2-e39f68c46496\" (UID: \"15e66009-37fa-4f89-aba2-e39f68c46496\") " Mar 09 13:37:01 crc kubenswrapper[4764]: I0309 13:37:01.595928 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15e66009-37fa-4f89-aba2-e39f68c46496-kube-api-access-g9v92" (OuterVolumeSpecName: "kube-api-access-g9v92") pod "15e66009-37fa-4f89-aba2-e39f68c46496" (UID: "15e66009-37fa-4f89-aba2-e39f68c46496"). InnerVolumeSpecName "kube-api-access-g9v92". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:37:01 crc kubenswrapper[4764]: I0309 13:37:01.691192 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9v92\" (UniqueName: \"kubernetes.io/projected/15e66009-37fa-4f89-aba2-e39f68c46496-kube-api-access-g9v92\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:02 crc kubenswrapper[4764]: I0309 13:37:02.037797 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lvrg9" event={"ID":"8e6c087a-8aaa-427c-822b-a274e19cc440","Type":"ContainerStarted","Data":"8761408c1f43723e51de7920c8e1301b9c1ef1b34821dca6d09ceaef4a9b756b"} Mar 09 13:37:02 crc kubenswrapper[4764]: I0309 13:37:02.038359 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lvrg9" event={"ID":"8e6c087a-8aaa-427c-822b-a274e19cc440","Type":"ContainerStarted","Data":"38a5ed38f395b2547ef8de67cb616f1ee84cca2cb343c5104ceec7270c8f4d8a"} Mar 09 13:37:02 crc kubenswrapper[4764]: I0309 13:37:02.040509 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="15e66009-37fa-4f89-aba2-e39f68c46496" containerID="1dc0bce8d8df10f300fba641a99cc5d62b54813faee2b39fbfc95ac2058ac6a6" exitCode=0 Mar 09 13:37:02 crc kubenswrapper[4764]: I0309 13:37:02.040548 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lnxvv" event={"ID":"15e66009-37fa-4f89-aba2-e39f68c46496","Type":"ContainerDied","Data":"1dc0bce8d8df10f300fba641a99cc5d62b54813faee2b39fbfc95ac2058ac6a6"} Mar 09 13:37:02 crc kubenswrapper[4764]: I0309 13:37:02.040574 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lnxvv" event={"ID":"15e66009-37fa-4f89-aba2-e39f68c46496","Type":"ContainerDied","Data":"357d479dcf0b97e2bac8502d01fb6d3b44d98edba00f3d6869efb668be985ee8"} Mar 09 13:37:02 crc kubenswrapper[4764]: I0309 13:37:02.040599 4764 scope.go:117] "RemoveContainer" containerID="1dc0bce8d8df10f300fba641a99cc5d62b54813faee2b39fbfc95ac2058ac6a6" Mar 09 13:37:02 crc kubenswrapper[4764]: I0309 13:37:02.040716 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-lnxvv" Mar 09 13:37:02 crc kubenswrapper[4764]: I0309 13:37:02.063744 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-lvrg9" podStartSLOduration=2.01418457 podStartE2EDuration="2.063712114s" podCreationTimestamp="2026-03-09 13:37:00 +0000 UTC" firstStartedPulling="2026-03-09 13:37:01.266353739 +0000 UTC m=+976.516525647" lastFinishedPulling="2026-03-09 13:37:01.315881283 +0000 UTC m=+976.566053191" observedRunningTime="2026-03-09 13:37:02.056809789 +0000 UTC m=+977.306981697" watchObservedRunningTime="2026-03-09 13:37:02.063712114 +0000 UTC m=+977.313884032" Mar 09 13:37:02 crc kubenswrapper[4764]: I0309 13:37:02.073961 4764 scope.go:117] "RemoveContainer" containerID="1dc0bce8d8df10f300fba641a99cc5d62b54813faee2b39fbfc95ac2058ac6a6" Mar 09 13:37:02 crc kubenswrapper[4764]: E0309 13:37:02.074462 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dc0bce8d8df10f300fba641a99cc5d62b54813faee2b39fbfc95ac2058ac6a6\": container with ID starting with 1dc0bce8d8df10f300fba641a99cc5d62b54813faee2b39fbfc95ac2058ac6a6 not found: ID does not exist" containerID="1dc0bce8d8df10f300fba641a99cc5d62b54813faee2b39fbfc95ac2058ac6a6" Mar 09 13:37:02 crc kubenswrapper[4764]: I0309 13:37:02.074501 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dc0bce8d8df10f300fba641a99cc5d62b54813faee2b39fbfc95ac2058ac6a6"} err="failed to get container status \"1dc0bce8d8df10f300fba641a99cc5d62b54813faee2b39fbfc95ac2058ac6a6\": rpc error: code = NotFound desc = could not find container \"1dc0bce8d8df10f300fba641a99cc5d62b54813faee2b39fbfc95ac2058ac6a6\": container with ID starting with 1dc0bce8d8df10f300fba641a99cc5d62b54813faee2b39fbfc95ac2058ac6a6 not found: ID does not exist" Mar 09 13:37:02 crc kubenswrapper[4764]: I0309 
13:37:02.085792 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-lnxvv"] Mar 09 13:37:02 crc kubenswrapper[4764]: I0309 13:37:02.091540 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-lnxvv"] Mar 09 13:37:02 crc kubenswrapper[4764]: I0309 13:37:02.841698 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-kl47c" Mar 09 13:37:03 crc kubenswrapper[4764]: I0309 13:37:03.568192 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15e66009-37fa-4f89-aba2-e39f68c46496" path="/var/lib/kubelet/pods/15e66009-37fa-4f89-aba2-e39f68c46496/volumes" Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.212883 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wbrdf"] Mar 09 13:37:10 crc kubenswrapper[4764]: E0309 13:37:10.213717 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15e66009-37fa-4f89-aba2-e39f68c46496" containerName="registry-server" Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.213728 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="15e66009-37fa-4f89-aba2-e39f68c46496" containerName="registry-server" Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.213864 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="15e66009-37fa-4f89-aba2-e39f68c46496" containerName="registry-server" Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.214635 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wbrdf" Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.224280 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wbrdf"] Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.326276 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5554t\" (UniqueName: \"kubernetes.io/projected/a8dfccd4-6f59-4e38-8beb-d586722f6429-kube-api-access-5554t\") pod \"community-operators-wbrdf\" (UID: \"a8dfccd4-6f59-4e38-8beb-d586722f6429\") " pod="openshift-marketplace/community-operators-wbrdf" Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.326326 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8dfccd4-6f59-4e38-8beb-d586722f6429-utilities\") pod \"community-operators-wbrdf\" (UID: \"a8dfccd4-6f59-4e38-8beb-d586722f6429\") " pod="openshift-marketplace/community-operators-wbrdf" Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.326767 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8dfccd4-6f59-4e38-8beb-d586722f6429-catalog-content\") pod \"community-operators-wbrdf\" (UID: \"a8dfccd4-6f59-4e38-8beb-d586722f6429\") " pod="openshift-marketplace/community-operators-wbrdf" Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.427919 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8dfccd4-6f59-4e38-8beb-d586722f6429-catalog-content\") pod \"community-operators-wbrdf\" (UID: \"a8dfccd4-6f59-4e38-8beb-d586722f6429\") " pod="openshift-marketplace/community-operators-wbrdf" Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.428398 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5554t\" (UniqueName: \"kubernetes.io/projected/a8dfccd4-6f59-4e38-8beb-d586722f6429-kube-api-access-5554t\") pod \"community-operators-wbrdf\" (UID: \"a8dfccd4-6f59-4e38-8beb-d586722f6429\") " pod="openshift-marketplace/community-operators-wbrdf" Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.428440 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8dfccd4-6f59-4e38-8beb-d586722f6429-utilities\") pod \"community-operators-wbrdf\" (UID: \"a8dfccd4-6f59-4e38-8beb-d586722f6429\") " pod="openshift-marketplace/community-operators-wbrdf" Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.428978 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8dfccd4-6f59-4e38-8beb-d586722f6429-catalog-content\") pod \"community-operators-wbrdf\" (UID: \"a8dfccd4-6f59-4e38-8beb-d586722f6429\") " pod="openshift-marketplace/community-operators-wbrdf" Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.429094 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8dfccd4-6f59-4e38-8beb-d586722f6429-utilities\") pod \"community-operators-wbrdf\" (UID: \"a8dfccd4-6f59-4e38-8beb-d586722f6429\") " pod="openshift-marketplace/community-operators-wbrdf" Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.448431 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5554t\" (UniqueName: \"kubernetes.io/projected/a8dfccd4-6f59-4e38-8beb-d586722f6429-kube-api-access-5554t\") pod \"community-operators-wbrdf\" (UID: \"a8dfccd4-6f59-4e38-8beb-d586722f6429\") " pod="openshift-marketplace/community-operators-wbrdf" Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.583760 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wbrdf" Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.813784 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-lvrg9" Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.814811 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-lvrg9" Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.817346 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wbrdf"] Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.872004 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-lvrg9" Mar 09 13:37:11 crc kubenswrapper[4764]: I0309 13:37:11.105252 4764 generic.go:334] "Generic (PLEG): container finished" podID="a8dfccd4-6f59-4e38-8beb-d586722f6429" containerID="65f7c38a5f862088c540cb251a678acbf56eb7ce24508fe81ee1a45ff576510e" exitCode=0 Mar 09 13:37:11 crc kubenswrapper[4764]: I0309 13:37:11.105352 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbrdf" event={"ID":"a8dfccd4-6f59-4e38-8beb-d586722f6429","Type":"ContainerDied","Data":"65f7c38a5f862088c540cb251a678acbf56eb7ce24508fe81ee1a45ff576510e"} Mar 09 13:37:11 crc kubenswrapper[4764]: I0309 13:37:11.105398 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbrdf" event={"ID":"a8dfccd4-6f59-4e38-8beb-d586722f6429","Type":"ContainerStarted","Data":"3e4da4181002b0bbc30f0995a9b31086d3309299728b9e55d210f7d242dd2b4b"} Mar 09 13:37:11 crc kubenswrapper[4764]: I0309 13:37:11.131854 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-lvrg9" Mar 09 13:37:13 crc kubenswrapper[4764]: I0309 13:37:13.120451 4764 
generic.go:334] "Generic (PLEG): container finished" podID="a8dfccd4-6f59-4e38-8beb-d586722f6429" containerID="83d42e877f0fbd05f43d48b955fbdaf6a30563c45f97fa84b159a579fe3f00b5" exitCode=0 Mar 09 13:37:13 crc kubenswrapper[4764]: I0309 13:37:13.120500 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbrdf" event={"ID":"a8dfccd4-6f59-4e38-8beb-d586722f6429","Type":"ContainerDied","Data":"83d42e877f0fbd05f43d48b955fbdaf6a30563c45f97fa84b159a579fe3f00b5"} Mar 09 13:37:14 crc kubenswrapper[4764]: I0309 13:37:14.131347 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbrdf" event={"ID":"a8dfccd4-6f59-4e38-8beb-d586722f6429","Type":"ContainerStarted","Data":"9eca7f1ea0758281e2a6e18462903a1ceedc6b59b0cf6098bdfe193448c7d9e5"} Mar 09 13:37:14 crc kubenswrapper[4764]: I0309 13:37:14.150752 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wbrdf" podStartSLOduration=1.665454113 podStartE2EDuration="4.150728351s" podCreationTimestamp="2026-03-09 13:37:10 +0000 UTC" firstStartedPulling="2026-03-09 13:37:11.107279206 +0000 UTC m=+986.357451114" lastFinishedPulling="2026-03-09 13:37:13.592553444 +0000 UTC m=+988.842725352" observedRunningTime="2026-03-09 13:37:14.149013325 +0000 UTC m=+989.399185233" watchObservedRunningTime="2026-03-09 13:37:14.150728351 +0000 UTC m=+989.400900279" Mar 09 13:37:18 crc kubenswrapper[4764]: I0309 13:37:18.651433 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5"] Mar 09 13:37:18 crc kubenswrapper[4764]: I0309 13:37:18.653670 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5" Mar 09 13:37:18 crc kubenswrapper[4764]: I0309 13:37:18.656614 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-78ggt" Mar 09 13:37:18 crc kubenswrapper[4764]: I0309 13:37:18.662667 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5"] Mar 09 13:37:18 crc kubenswrapper[4764]: I0309 13:37:18.769107 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f89e888-fc0d-48c0-ad4c-978e058ffebd-bundle\") pod \"bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5\" (UID: \"3f89e888-fc0d-48c0-ad4c-978e058ffebd\") " pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5" Mar 09 13:37:18 crc kubenswrapper[4764]: I0309 13:37:18.769370 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlzbq\" (UniqueName: \"kubernetes.io/projected/3f89e888-fc0d-48c0-ad4c-978e058ffebd-kube-api-access-mlzbq\") pod \"bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5\" (UID: \"3f89e888-fc0d-48c0-ad4c-978e058ffebd\") " pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5" Mar 09 13:37:18 crc kubenswrapper[4764]: I0309 13:37:18.769531 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f89e888-fc0d-48c0-ad4c-978e058ffebd-util\") pod \"bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5\" (UID: \"3f89e888-fc0d-48c0-ad4c-978e058ffebd\") " pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5" Mar 09 13:37:18 crc kubenswrapper[4764]: I0309 
13:37:18.871256 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f89e888-fc0d-48c0-ad4c-978e058ffebd-bundle\") pod \"bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5\" (UID: \"3f89e888-fc0d-48c0-ad4c-978e058ffebd\") " pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5" Mar 09 13:37:18 crc kubenswrapper[4764]: I0309 13:37:18.871556 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlzbq\" (UniqueName: \"kubernetes.io/projected/3f89e888-fc0d-48c0-ad4c-978e058ffebd-kube-api-access-mlzbq\") pod \"bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5\" (UID: \"3f89e888-fc0d-48c0-ad4c-978e058ffebd\") " pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5" Mar 09 13:37:18 crc kubenswrapper[4764]: I0309 13:37:18.871610 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f89e888-fc0d-48c0-ad4c-978e058ffebd-util\") pod \"bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5\" (UID: \"3f89e888-fc0d-48c0-ad4c-978e058ffebd\") " pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5" Mar 09 13:37:18 crc kubenswrapper[4764]: I0309 13:37:18.872325 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f89e888-fc0d-48c0-ad4c-978e058ffebd-util\") pod \"bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5\" (UID: \"3f89e888-fc0d-48c0-ad4c-978e058ffebd\") " pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5" Mar 09 13:37:18 crc kubenswrapper[4764]: I0309 13:37:18.872314 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/3f89e888-fc0d-48c0-ad4c-978e058ffebd-bundle\") pod \"bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5\" (UID: \"3f89e888-fc0d-48c0-ad4c-978e058ffebd\") " pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5" Mar 09 13:37:18 crc kubenswrapper[4764]: I0309 13:37:18.904742 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlzbq\" (UniqueName: \"kubernetes.io/projected/3f89e888-fc0d-48c0-ad4c-978e058ffebd-kube-api-access-mlzbq\") pod \"bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5\" (UID: \"3f89e888-fc0d-48c0-ad4c-978e058ffebd\") " pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5" Mar 09 13:37:18 crc kubenswrapper[4764]: I0309 13:37:18.975828 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5" Mar 09 13:37:19 crc kubenswrapper[4764]: I0309 13:37:19.427367 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5"] Mar 09 13:37:20 crc kubenswrapper[4764]: I0309 13:37:20.181444 4764 generic.go:334] "Generic (PLEG): container finished" podID="3f89e888-fc0d-48c0-ad4c-978e058ffebd" containerID="73b3fa69a88f215c55d983ce1eed7ce8947722da1ea80a695e4eb68985582271" exitCode=0 Mar 09 13:37:20 crc kubenswrapper[4764]: I0309 13:37:20.181524 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5" event={"ID":"3f89e888-fc0d-48c0-ad4c-978e058ffebd","Type":"ContainerDied","Data":"73b3fa69a88f215c55d983ce1eed7ce8947722da1ea80a695e4eb68985582271"} Mar 09 13:37:20 crc kubenswrapper[4764]: I0309 13:37:20.181568 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5" event={"ID":"3f89e888-fc0d-48c0-ad4c-978e058ffebd","Type":"ContainerStarted","Data":"3143d48deb25e1228b0b9bf2779c6a99ff527ab1f6ee5b799af9189c2a7983c8"} Mar 09 13:37:20 crc kubenswrapper[4764]: I0309 13:37:20.583915 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wbrdf" Mar 09 13:37:20 crc kubenswrapper[4764]: I0309 13:37:20.583980 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wbrdf" Mar 09 13:37:20 crc kubenswrapper[4764]: I0309 13:37:20.636352 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wbrdf" Mar 09 13:37:21 crc kubenswrapper[4764]: I0309 13:37:21.193774 4764 generic.go:334] "Generic (PLEG): container finished" podID="3f89e888-fc0d-48c0-ad4c-978e058ffebd" containerID="c761b16c6f8255fa5f43bd5bfb98564a0b3ca85e3beeeee7d200204b4d9f2fef" exitCode=0 Mar 09 13:37:21 crc kubenswrapper[4764]: I0309 13:37:21.193954 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5" event={"ID":"3f89e888-fc0d-48c0-ad4c-978e058ffebd","Type":"ContainerDied","Data":"c761b16c6f8255fa5f43bd5bfb98564a0b3ca85e3beeeee7d200204b4d9f2fef"} Mar 09 13:37:21 crc kubenswrapper[4764]: I0309 13:37:21.269200 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wbrdf" Mar 09 13:37:22 crc kubenswrapper[4764]: I0309 13:37:22.206321 4764 generic.go:334] "Generic (PLEG): container finished" podID="3f89e888-fc0d-48c0-ad4c-978e058ffebd" containerID="5996dc65947ecc7b229c5bfcde0b46498cf6fc32f67fc10a16c94d68375e7654" exitCode=0 Mar 09 13:37:22 crc kubenswrapper[4764]: I0309 13:37:22.207590 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5" event={"ID":"3f89e888-fc0d-48c0-ad4c-978e058ffebd","Type":"ContainerDied","Data":"5996dc65947ecc7b229c5bfcde0b46498cf6fc32f67fc10a16c94d68375e7654"} Mar 09 13:37:23 crc kubenswrapper[4764]: I0309 13:37:23.531405 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5" Mar 09 13:37:23 crc kubenswrapper[4764]: I0309 13:37:23.654496 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlzbq\" (UniqueName: \"kubernetes.io/projected/3f89e888-fc0d-48c0-ad4c-978e058ffebd-kube-api-access-mlzbq\") pod \"3f89e888-fc0d-48c0-ad4c-978e058ffebd\" (UID: \"3f89e888-fc0d-48c0-ad4c-978e058ffebd\") " Mar 09 13:37:23 crc kubenswrapper[4764]: I0309 13:37:23.654699 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f89e888-fc0d-48c0-ad4c-978e058ffebd-util\") pod \"3f89e888-fc0d-48c0-ad4c-978e058ffebd\" (UID: \"3f89e888-fc0d-48c0-ad4c-978e058ffebd\") " Mar 09 13:37:23 crc kubenswrapper[4764]: I0309 13:37:23.654769 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f89e888-fc0d-48c0-ad4c-978e058ffebd-bundle\") pod \"3f89e888-fc0d-48c0-ad4c-978e058ffebd\" (UID: \"3f89e888-fc0d-48c0-ad4c-978e058ffebd\") " Mar 09 13:37:23 crc kubenswrapper[4764]: I0309 13:37:23.655770 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f89e888-fc0d-48c0-ad4c-978e058ffebd-bundle" (OuterVolumeSpecName: "bundle") pod "3f89e888-fc0d-48c0-ad4c-978e058ffebd" (UID: "3f89e888-fc0d-48c0-ad4c-978e058ffebd"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:37:23 crc kubenswrapper[4764]: I0309 13:37:23.661870 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f89e888-fc0d-48c0-ad4c-978e058ffebd-kube-api-access-mlzbq" (OuterVolumeSpecName: "kube-api-access-mlzbq") pod "3f89e888-fc0d-48c0-ad4c-978e058ffebd" (UID: "3f89e888-fc0d-48c0-ad4c-978e058ffebd"). InnerVolumeSpecName "kube-api-access-mlzbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:37:23 crc kubenswrapper[4764]: I0309 13:37:23.669906 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f89e888-fc0d-48c0-ad4c-978e058ffebd-util" (OuterVolumeSpecName: "util") pod "3f89e888-fc0d-48c0-ad4c-978e058ffebd" (UID: "3f89e888-fc0d-48c0-ad4c-978e058ffebd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:37:23 crc kubenswrapper[4764]: I0309 13:37:23.757313 4764 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f89e888-fc0d-48c0-ad4c-978e058ffebd-util\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:23 crc kubenswrapper[4764]: I0309 13:37:23.757361 4764 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f89e888-fc0d-48c0-ad4c-978e058ffebd-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:23 crc kubenswrapper[4764]: I0309 13:37:23.757374 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlzbq\" (UniqueName: \"kubernetes.io/projected/3f89e888-fc0d-48c0-ad4c-978e058ffebd-kube-api-access-mlzbq\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:23 crc kubenswrapper[4764]: I0309 13:37:23.799163 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wbrdf"] Mar 09 13:37:23 crc kubenswrapper[4764]: I0309 13:37:23.799546 4764 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/community-operators-wbrdf" podUID="a8dfccd4-6f59-4e38-8beb-d586722f6429" containerName="registry-server" containerID="cri-o://9eca7f1ea0758281e2a6e18462903a1ceedc6b59b0cf6098bdfe193448c7d9e5" gracePeriod=2 Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.223315 4764 generic.go:334] "Generic (PLEG): container finished" podID="a8dfccd4-6f59-4e38-8beb-d586722f6429" containerID="9eca7f1ea0758281e2a6e18462903a1ceedc6b59b0cf6098bdfe193448c7d9e5" exitCode=0 Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.223421 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbrdf" event={"ID":"a8dfccd4-6f59-4e38-8beb-d586722f6429","Type":"ContainerDied","Data":"9eca7f1ea0758281e2a6e18462903a1ceedc6b59b0cf6098bdfe193448c7d9e5"} Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.226799 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5" event={"ID":"3f89e888-fc0d-48c0-ad4c-978e058ffebd","Type":"ContainerDied","Data":"3143d48deb25e1228b0b9bf2779c6a99ff527ab1f6ee5b799af9189c2a7983c8"} Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.226846 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3143d48deb25e1228b0b9bf2779c6a99ff527ab1f6ee5b799af9189c2a7983c8" Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.226910 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5" Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.261510 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wbrdf" Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.364746 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8dfccd4-6f59-4e38-8beb-d586722f6429-catalog-content\") pod \"a8dfccd4-6f59-4e38-8beb-d586722f6429\" (UID: \"a8dfccd4-6f59-4e38-8beb-d586722f6429\") " Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.364842 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8dfccd4-6f59-4e38-8beb-d586722f6429-utilities\") pod \"a8dfccd4-6f59-4e38-8beb-d586722f6429\" (UID: \"a8dfccd4-6f59-4e38-8beb-d586722f6429\") " Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.364980 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5554t\" (UniqueName: \"kubernetes.io/projected/a8dfccd4-6f59-4e38-8beb-d586722f6429-kube-api-access-5554t\") pod \"a8dfccd4-6f59-4e38-8beb-d586722f6429\" (UID: \"a8dfccd4-6f59-4e38-8beb-d586722f6429\") " Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.366009 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8dfccd4-6f59-4e38-8beb-d586722f6429-utilities" (OuterVolumeSpecName: "utilities") pod "a8dfccd4-6f59-4e38-8beb-d586722f6429" (UID: "a8dfccd4-6f59-4e38-8beb-d586722f6429"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.370847 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8dfccd4-6f59-4e38-8beb-d586722f6429-kube-api-access-5554t" (OuterVolumeSpecName: "kube-api-access-5554t") pod "a8dfccd4-6f59-4e38-8beb-d586722f6429" (UID: "a8dfccd4-6f59-4e38-8beb-d586722f6429"). InnerVolumeSpecName "kube-api-access-5554t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.423458 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8dfccd4-6f59-4e38-8beb-d586722f6429-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8dfccd4-6f59-4e38-8beb-d586722f6429" (UID: "a8dfccd4-6f59-4e38-8beb-d586722f6429"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.466673 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5554t\" (UniqueName: \"kubernetes.io/projected/a8dfccd4-6f59-4e38-8beb-d586722f6429-kube-api-access-5554t\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.466726 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8dfccd4-6f59-4e38-8beb-d586722f6429-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.466739 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8dfccd4-6f59-4e38-8beb-d586722f6429-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.807771 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m4qqw"] Mar 09 13:37:24 crc kubenswrapper[4764]: E0309 13:37:24.808152 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f89e888-fc0d-48c0-ad4c-978e058ffebd" containerName="pull" Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.808170 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f89e888-fc0d-48c0-ad4c-978e058ffebd" containerName="pull" Mar 09 13:37:24 crc kubenswrapper[4764]: E0309 13:37:24.808182 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3f89e888-fc0d-48c0-ad4c-978e058ffebd" containerName="util" Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.808190 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f89e888-fc0d-48c0-ad4c-978e058ffebd" containerName="util" Mar 09 13:37:24 crc kubenswrapper[4764]: E0309 13:37:24.808199 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f89e888-fc0d-48c0-ad4c-978e058ffebd" containerName="extract" Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.808206 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f89e888-fc0d-48c0-ad4c-978e058ffebd" containerName="extract" Mar 09 13:37:24 crc kubenswrapper[4764]: E0309 13:37:24.808226 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8dfccd4-6f59-4e38-8beb-d586722f6429" containerName="extract-utilities" Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.808233 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8dfccd4-6f59-4e38-8beb-d586722f6429" containerName="extract-utilities" Mar 09 13:37:24 crc kubenswrapper[4764]: E0309 13:37:24.808241 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8dfccd4-6f59-4e38-8beb-d586722f6429" containerName="extract-content" Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.808249 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8dfccd4-6f59-4e38-8beb-d586722f6429" containerName="extract-content" Mar 09 13:37:24 crc kubenswrapper[4764]: E0309 13:37:24.808260 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8dfccd4-6f59-4e38-8beb-d586722f6429" containerName="registry-server" Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.808266 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8dfccd4-6f59-4e38-8beb-d586722f6429" containerName="registry-server" Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.808412 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8dfccd4-6f59-4e38-8beb-d586722f6429" 
containerName="registry-server" Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.808426 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f89e888-fc0d-48c0-ad4c-978e058ffebd" containerName="extract" Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.809399 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m4qqw" Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.848737 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m4qqw"] Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.873288 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6chf\" (UniqueName: \"kubernetes.io/projected/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-kube-api-access-p6chf\") pod \"redhat-marketplace-m4qqw\" (UID: \"36d190ba-cb10-4d0a-a5f2-b87befbf6f87\") " pod="openshift-marketplace/redhat-marketplace-m4qqw" Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.873395 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-utilities\") pod \"redhat-marketplace-m4qqw\" (UID: \"36d190ba-cb10-4d0a-a5f2-b87befbf6f87\") " pod="openshift-marketplace/redhat-marketplace-m4qqw" Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.873459 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-catalog-content\") pod \"redhat-marketplace-m4qqw\" (UID: \"36d190ba-cb10-4d0a-a5f2-b87befbf6f87\") " pod="openshift-marketplace/redhat-marketplace-m4qqw" Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.975229 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-catalog-content\") pod \"redhat-marketplace-m4qqw\" (UID: \"36d190ba-cb10-4d0a-a5f2-b87befbf6f87\") " pod="openshift-marketplace/redhat-marketplace-m4qqw" Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.975344 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6chf\" (UniqueName: \"kubernetes.io/projected/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-kube-api-access-p6chf\") pod \"redhat-marketplace-m4qqw\" (UID: \"36d190ba-cb10-4d0a-a5f2-b87befbf6f87\") " pod="openshift-marketplace/redhat-marketplace-m4qqw" Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.975420 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-utilities\") pod \"redhat-marketplace-m4qqw\" (UID: \"36d190ba-cb10-4d0a-a5f2-b87befbf6f87\") " pod="openshift-marketplace/redhat-marketplace-m4qqw" Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.976139 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-utilities\") pod \"redhat-marketplace-m4qqw\" (UID: \"36d190ba-cb10-4d0a-a5f2-b87befbf6f87\") " pod="openshift-marketplace/redhat-marketplace-m4qqw" Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.976228 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-catalog-content\") pod \"redhat-marketplace-m4qqw\" (UID: \"36d190ba-cb10-4d0a-a5f2-b87befbf6f87\") " pod="openshift-marketplace/redhat-marketplace-m4qqw" Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.994033 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6chf\" (UniqueName: 
\"kubernetes.io/projected/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-kube-api-access-p6chf\") pod \"redhat-marketplace-m4qqw\" (UID: \"36d190ba-cb10-4d0a-a5f2-b87befbf6f87\") " pod="openshift-marketplace/redhat-marketplace-m4qqw" Mar 09 13:37:25 crc kubenswrapper[4764]: I0309 13:37:25.127053 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m4qqw" Mar 09 13:37:25 crc kubenswrapper[4764]: I0309 13:37:25.239005 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbrdf" event={"ID":"a8dfccd4-6f59-4e38-8beb-d586722f6429","Type":"ContainerDied","Data":"3e4da4181002b0bbc30f0995a9b31086d3309299728b9e55d210f7d242dd2b4b"} Mar 09 13:37:25 crc kubenswrapper[4764]: I0309 13:37:25.239361 4764 scope.go:117] "RemoveContainer" containerID="9eca7f1ea0758281e2a6e18462903a1ceedc6b59b0cf6098bdfe193448c7d9e5" Mar 09 13:37:25 crc kubenswrapper[4764]: I0309 13:37:25.239515 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wbrdf" Mar 09 13:37:25 crc kubenswrapper[4764]: I0309 13:37:25.278772 4764 scope.go:117] "RemoveContainer" containerID="83d42e877f0fbd05f43d48b955fbdaf6a30563c45f97fa84b159a579fe3f00b5" Mar 09 13:37:25 crc kubenswrapper[4764]: I0309 13:37:25.281238 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wbrdf"] Mar 09 13:37:25 crc kubenswrapper[4764]: I0309 13:37:25.286427 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wbrdf"] Mar 09 13:37:25 crc kubenswrapper[4764]: I0309 13:37:25.306978 4764 scope.go:117] "RemoveContainer" containerID="65f7c38a5f862088c540cb251a678acbf56eb7ce24508fe81ee1a45ff576510e" Mar 09 13:37:25 crc kubenswrapper[4764]: I0309 13:37:25.570819 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8dfccd4-6f59-4e38-8beb-d586722f6429" path="/var/lib/kubelet/pods/a8dfccd4-6f59-4e38-8beb-d586722f6429/volumes" Mar 09 13:37:25 crc kubenswrapper[4764]: I0309 13:37:25.611676 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m4qqw"] Mar 09 13:37:25 crc kubenswrapper[4764]: W0309 13:37:25.622920 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36d190ba_cb10_4d0a_a5f2_b87befbf6f87.slice/crio-7755051cd41c2e60be705b844644aeaa882858e41de06e49db1ac8b1d1793e1e WatchSource:0}: Error finding container 7755051cd41c2e60be705b844644aeaa882858e41de06e49db1ac8b1d1793e1e: Status 404 returned error can't find the container with id 7755051cd41c2e60be705b844644aeaa882858e41de06e49db1ac8b1d1793e1e Mar 09 13:37:26 crc kubenswrapper[4764]: I0309 13:37:26.247528 4764 generic.go:334] "Generic (PLEG): container finished" podID="36d190ba-cb10-4d0a-a5f2-b87befbf6f87" containerID="476aec4d944077e73498f316fe9d3fd7b8aa5f1995803ba7f7af28feb66c2ef3" exitCode=0 
Mar 09 13:37:26 crc kubenswrapper[4764]: I0309 13:37:26.247597 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4qqw" event={"ID":"36d190ba-cb10-4d0a-a5f2-b87befbf6f87","Type":"ContainerDied","Data":"476aec4d944077e73498f316fe9d3fd7b8aa5f1995803ba7f7af28feb66c2ef3"} Mar 09 13:37:26 crc kubenswrapper[4764]: I0309 13:37:26.248016 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4qqw" event={"ID":"36d190ba-cb10-4d0a-a5f2-b87befbf6f87","Type":"ContainerStarted","Data":"7755051cd41c2e60be705b844644aeaa882858e41de06e49db1ac8b1d1793e1e"} Mar 09 13:37:27 crc kubenswrapper[4764]: I0309 13:37:27.260463 4764 generic.go:334] "Generic (PLEG): container finished" podID="36d190ba-cb10-4d0a-a5f2-b87befbf6f87" containerID="1760acfca8d45db46ccb61c42fe77b466941c9ff04663728cd5544b31e088cab" exitCode=0 Mar 09 13:37:27 crc kubenswrapper[4764]: I0309 13:37:27.260528 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4qqw" event={"ID":"36d190ba-cb10-4d0a-a5f2-b87befbf6f87","Type":"ContainerDied","Data":"1760acfca8d45db46ccb61c42fe77b466941c9ff04663728cd5544b31e088cab"} Mar 09 13:37:28 crc kubenswrapper[4764]: I0309 13:37:28.269404 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4qqw" event={"ID":"36d190ba-cb10-4d0a-a5f2-b87befbf6f87","Type":"ContainerStarted","Data":"268126c07789b73f5e804a36c88266544a60c65041565cfbad871a65aa4c42e4"} Mar 09 13:37:28 crc kubenswrapper[4764]: I0309 13:37:28.292161 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6754b7f846-ns9zn"] Mar 09 13:37:28 crc kubenswrapper[4764]: I0309 13:37:28.293495 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6754b7f846-ns9zn" Mar 09 13:37:28 crc kubenswrapper[4764]: I0309 13:37:28.295623 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-44sd7" Mar 09 13:37:28 crc kubenswrapper[4764]: I0309 13:37:28.307063 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m4qqw" podStartSLOduration=2.8889430369999998 podStartE2EDuration="4.307042293s" podCreationTimestamp="2026-03-09 13:37:24 +0000 UTC" firstStartedPulling="2026-03-09 13:37:26.249890117 +0000 UTC m=+1001.500062025" lastFinishedPulling="2026-03-09 13:37:27.667989373 +0000 UTC m=+1002.918161281" observedRunningTime="2026-03-09 13:37:28.304328451 +0000 UTC m=+1003.554500369" watchObservedRunningTime="2026-03-09 13:37:28.307042293 +0000 UTC m=+1003.557214201" Mar 09 13:37:28 crc kubenswrapper[4764]: I0309 13:37:28.328739 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6754b7f846-ns9zn"] Mar 09 13:37:28 crc kubenswrapper[4764]: I0309 13:37:28.370030 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:37:28 crc kubenswrapper[4764]: I0309 13:37:28.370092 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:37:28 crc kubenswrapper[4764]: I0309 13:37:28.429393 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zfsv\" (UniqueName: \"kubernetes.io/projected/67c57635-59f1-48a2-9823-c86732eabbf6-kube-api-access-4zfsv\") pod \"openstack-operator-controller-init-6754b7f846-ns9zn\" (UID: \"67c57635-59f1-48a2-9823-c86732eabbf6\") " pod="openstack-operators/openstack-operator-controller-init-6754b7f846-ns9zn" Mar 09 13:37:28 crc kubenswrapper[4764]: I0309 13:37:28.531003 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zfsv\" (UniqueName: \"kubernetes.io/projected/67c57635-59f1-48a2-9823-c86732eabbf6-kube-api-access-4zfsv\") pod \"openstack-operator-controller-init-6754b7f846-ns9zn\" (UID: \"67c57635-59f1-48a2-9823-c86732eabbf6\") " pod="openstack-operators/openstack-operator-controller-init-6754b7f846-ns9zn" Mar 09 13:37:28 crc kubenswrapper[4764]: I0309 13:37:28.562709 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zfsv\" (UniqueName: \"kubernetes.io/projected/67c57635-59f1-48a2-9823-c86732eabbf6-kube-api-access-4zfsv\") pod \"openstack-operator-controller-init-6754b7f846-ns9zn\" (UID: \"67c57635-59f1-48a2-9823-c86732eabbf6\") " pod="openstack-operators/openstack-operator-controller-init-6754b7f846-ns9zn" Mar 09 13:37:28 crc kubenswrapper[4764]: I0309 13:37:28.609868 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6754b7f846-ns9zn" Mar 09 13:37:28 crc kubenswrapper[4764]: I0309 13:37:28.892349 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6754b7f846-ns9zn"] Mar 09 13:37:28 crc kubenswrapper[4764]: W0309 13:37:28.899893 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67c57635_59f1_48a2_9823_c86732eabbf6.slice/crio-50ed490116b2fe4bc00db782eb11ee4ed848d95032478408fb7b3ab412ea9e99 WatchSource:0}: Error finding container 50ed490116b2fe4bc00db782eb11ee4ed848d95032478408fb7b3ab412ea9e99: Status 404 returned error can't find the container with id 50ed490116b2fe4bc00db782eb11ee4ed848d95032478408fb7b3ab412ea9e99 Mar 09 13:37:29 crc kubenswrapper[4764]: I0309 13:37:29.280575 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6754b7f846-ns9zn" event={"ID":"67c57635-59f1-48a2-9823-c86732eabbf6","Type":"ContainerStarted","Data":"50ed490116b2fe4bc00db782eb11ee4ed848d95032478408fb7b3ab412ea9e99"} Mar 09 13:37:35 crc kubenswrapper[4764]: I0309 13:37:35.128227 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m4qqw" Mar 09 13:37:35 crc kubenswrapper[4764]: I0309 13:37:35.129044 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m4qqw" Mar 09 13:37:35 crc kubenswrapper[4764]: I0309 13:37:35.181620 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m4qqw" Mar 09 13:37:35 crc kubenswrapper[4764]: I0309 13:37:35.367037 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6754b7f846-ns9zn" 
event={"ID":"67c57635-59f1-48a2-9823-c86732eabbf6","Type":"ContainerStarted","Data":"c862bc3fd9e6268402af1f5ce9425cb23adf6320f25fe03c8a2fa25c24c088d2"} Mar 09 13:37:35 crc kubenswrapper[4764]: I0309 13:37:35.367357 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6754b7f846-ns9zn" Mar 09 13:37:35 crc kubenswrapper[4764]: I0309 13:37:35.398568 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6754b7f846-ns9zn" podStartSLOduration=1.9172723280000001 podStartE2EDuration="7.39854516s" podCreationTimestamp="2026-03-09 13:37:28 +0000 UTC" firstStartedPulling="2026-03-09 13:37:28.90179087 +0000 UTC m=+1004.151962778" lastFinishedPulling="2026-03-09 13:37:34.383063702 +0000 UTC m=+1009.633235610" observedRunningTime="2026-03-09 13:37:35.397555343 +0000 UTC m=+1010.647727261" watchObservedRunningTime="2026-03-09 13:37:35.39854516 +0000 UTC m=+1010.648717068" Mar 09 13:37:35 crc kubenswrapper[4764]: I0309 13:37:35.423984 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m4qqw" Mar 09 13:37:35 crc kubenswrapper[4764]: I0309 13:37:35.998504 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m4qqw"] Mar 09 13:37:37 crc kubenswrapper[4764]: I0309 13:37:37.380075 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m4qqw" podUID="36d190ba-cb10-4d0a-a5f2-b87befbf6f87" containerName="registry-server" containerID="cri-o://268126c07789b73f5e804a36c88266544a60c65041565cfbad871a65aa4c42e4" gracePeriod=2 Mar 09 13:37:37 crc kubenswrapper[4764]: E0309 13:37:37.503555 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36d190ba_cb10_4d0a_a5f2_b87befbf6f87.slice/crio-conmon-268126c07789b73f5e804a36c88266544a60c65041565cfbad871a65aa4c42e4.scope\": RecentStats: unable to find data in memory cache]" Mar 09 13:37:37 crc kubenswrapper[4764]: I0309 13:37:37.775606 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m4qqw" Mar 09 13:37:37 crc kubenswrapper[4764]: I0309 13:37:37.942750 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-catalog-content\") pod \"36d190ba-cb10-4d0a-a5f2-b87befbf6f87\" (UID: \"36d190ba-cb10-4d0a-a5f2-b87befbf6f87\") " Mar 09 13:37:37 crc kubenswrapper[4764]: I0309 13:37:37.943166 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6chf\" (UniqueName: \"kubernetes.io/projected/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-kube-api-access-p6chf\") pod \"36d190ba-cb10-4d0a-a5f2-b87befbf6f87\" (UID: \"36d190ba-cb10-4d0a-a5f2-b87befbf6f87\") " Mar 09 13:37:37 crc kubenswrapper[4764]: I0309 13:37:37.943340 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-utilities\") pod \"36d190ba-cb10-4d0a-a5f2-b87befbf6f87\" (UID: \"36d190ba-cb10-4d0a-a5f2-b87befbf6f87\") " Mar 09 13:37:37 crc kubenswrapper[4764]: I0309 13:37:37.944990 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-utilities" (OuterVolumeSpecName: "utilities") pod "36d190ba-cb10-4d0a-a5f2-b87befbf6f87" (UID: "36d190ba-cb10-4d0a-a5f2-b87befbf6f87"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:37:37 crc kubenswrapper[4764]: I0309 13:37:37.951271 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-kube-api-access-p6chf" (OuterVolumeSpecName: "kube-api-access-p6chf") pod "36d190ba-cb10-4d0a-a5f2-b87befbf6f87" (UID: "36d190ba-cb10-4d0a-a5f2-b87befbf6f87"). InnerVolumeSpecName "kube-api-access-p6chf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:37:37 crc kubenswrapper[4764]: I0309 13:37:37.987265 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36d190ba-cb10-4d0a-a5f2-b87befbf6f87" (UID: "36d190ba-cb10-4d0a-a5f2-b87befbf6f87"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.046497 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.046583 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.046606 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6chf\" (UniqueName: \"kubernetes.io/projected/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-kube-api-access-p6chf\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.390216 4764 generic.go:334] "Generic (PLEG): container finished" podID="36d190ba-cb10-4d0a-a5f2-b87befbf6f87" 
containerID="268126c07789b73f5e804a36c88266544a60c65041565cfbad871a65aa4c42e4" exitCode=0 Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.390295 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4qqw" event={"ID":"36d190ba-cb10-4d0a-a5f2-b87befbf6f87","Type":"ContainerDied","Data":"268126c07789b73f5e804a36c88266544a60c65041565cfbad871a65aa4c42e4"} Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.390352 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4qqw" event={"ID":"36d190ba-cb10-4d0a-a5f2-b87befbf6f87","Type":"ContainerDied","Data":"7755051cd41c2e60be705b844644aeaa882858e41de06e49db1ac8b1d1793e1e"} Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.390380 4764 scope.go:117] "RemoveContainer" containerID="268126c07789b73f5e804a36c88266544a60c65041565cfbad871a65aa4c42e4" Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.390299 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m4qqw" Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.421541 4764 scope.go:117] "RemoveContainer" containerID="1760acfca8d45db46ccb61c42fe77b466941c9ff04663728cd5544b31e088cab" Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.426380 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m4qqw"] Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.432788 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m4qqw"] Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.443172 4764 scope.go:117] "RemoveContainer" containerID="476aec4d944077e73498f316fe9d3fd7b8aa5f1995803ba7f7af28feb66c2ef3" Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.461965 4764 scope.go:117] "RemoveContainer" containerID="268126c07789b73f5e804a36c88266544a60c65041565cfbad871a65aa4c42e4" Mar 09 13:37:38 crc kubenswrapper[4764]: E0309 13:37:38.462441 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"268126c07789b73f5e804a36c88266544a60c65041565cfbad871a65aa4c42e4\": container with ID starting with 268126c07789b73f5e804a36c88266544a60c65041565cfbad871a65aa4c42e4 not found: ID does not exist" containerID="268126c07789b73f5e804a36c88266544a60c65041565cfbad871a65aa4c42e4" Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.462476 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"268126c07789b73f5e804a36c88266544a60c65041565cfbad871a65aa4c42e4"} err="failed to get container status \"268126c07789b73f5e804a36c88266544a60c65041565cfbad871a65aa4c42e4\": rpc error: code = NotFound desc = could not find container \"268126c07789b73f5e804a36c88266544a60c65041565cfbad871a65aa4c42e4\": container with ID starting with 268126c07789b73f5e804a36c88266544a60c65041565cfbad871a65aa4c42e4 not found: 
ID does not exist" Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.462496 4764 scope.go:117] "RemoveContainer" containerID="1760acfca8d45db46ccb61c42fe77b466941c9ff04663728cd5544b31e088cab" Mar 09 13:37:38 crc kubenswrapper[4764]: E0309 13:37:38.462791 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1760acfca8d45db46ccb61c42fe77b466941c9ff04663728cd5544b31e088cab\": container with ID starting with 1760acfca8d45db46ccb61c42fe77b466941c9ff04663728cd5544b31e088cab not found: ID does not exist" containerID="1760acfca8d45db46ccb61c42fe77b466941c9ff04663728cd5544b31e088cab" Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.462813 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1760acfca8d45db46ccb61c42fe77b466941c9ff04663728cd5544b31e088cab"} err="failed to get container status \"1760acfca8d45db46ccb61c42fe77b466941c9ff04663728cd5544b31e088cab\": rpc error: code = NotFound desc = could not find container \"1760acfca8d45db46ccb61c42fe77b466941c9ff04663728cd5544b31e088cab\": container with ID starting with 1760acfca8d45db46ccb61c42fe77b466941c9ff04663728cd5544b31e088cab not found: ID does not exist" Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.462825 4764 scope.go:117] "RemoveContainer" containerID="476aec4d944077e73498f316fe9d3fd7b8aa5f1995803ba7f7af28feb66c2ef3" Mar 09 13:37:38 crc kubenswrapper[4764]: E0309 13:37:38.463064 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"476aec4d944077e73498f316fe9d3fd7b8aa5f1995803ba7f7af28feb66c2ef3\": container with ID starting with 476aec4d944077e73498f316fe9d3fd7b8aa5f1995803ba7f7af28feb66c2ef3 not found: ID does not exist" containerID="476aec4d944077e73498f316fe9d3fd7b8aa5f1995803ba7f7af28feb66c2ef3" Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.463086 4764 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"476aec4d944077e73498f316fe9d3fd7b8aa5f1995803ba7f7af28feb66c2ef3"} err="failed to get container status \"476aec4d944077e73498f316fe9d3fd7b8aa5f1995803ba7f7af28feb66c2ef3\": rpc error: code = NotFound desc = could not find container \"476aec4d944077e73498f316fe9d3fd7b8aa5f1995803ba7f7af28feb66c2ef3\": container with ID starting with 476aec4d944077e73498f316fe9d3fd7b8aa5f1995803ba7f7af28feb66c2ef3 not found: ID does not exist" Mar 09 13:37:39 crc kubenswrapper[4764]: I0309 13:37:39.570815 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36d190ba-cb10-4d0a-a5f2-b87befbf6f87" path="/var/lib/kubelet/pods/36d190ba-cb10-4d0a-a5f2-b87befbf6f87/volumes" Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 13:37:42.412803 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jfqln"] Mar 09 13:37:42 crc kubenswrapper[4764]: E0309 13:37:42.413480 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d190ba-cb10-4d0a-a5f2-b87befbf6f87" containerName="extract-utilities" Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 13:37:42.413499 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d190ba-cb10-4d0a-a5f2-b87befbf6f87" containerName="extract-utilities" Mar 09 13:37:42 crc kubenswrapper[4764]: E0309 13:37:42.413516 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d190ba-cb10-4d0a-a5f2-b87befbf6f87" containerName="extract-content" Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 13:37:42.413523 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d190ba-cb10-4d0a-a5f2-b87befbf6f87" containerName="extract-content" Mar 09 13:37:42 crc kubenswrapper[4764]: E0309 13:37:42.413548 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d190ba-cb10-4d0a-a5f2-b87befbf6f87" containerName="registry-server" Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 13:37:42.413558 4764 
state_mem.go:107] "Deleted CPUSet assignment" podUID="36d190ba-cb10-4d0a-a5f2-b87befbf6f87" containerName="registry-server" Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 13:37:42.413768 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="36d190ba-cb10-4d0a-a5f2-b87befbf6f87" containerName="registry-server" Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 13:37:42.414878 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jfqln" Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 13:37:42.436758 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jfqln"] Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 13:37:42.508785 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a77dd9b9-647b-4a75-b754-d7c92507e241-catalog-content\") pod \"certified-operators-jfqln\" (UID: \"a77dd9b9-647b-4a75-b754-d7c92507e241\") " pod="openshift-marketplace/certified-operators-jfqln" Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 13:37:42.508834 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a77dd9b9-647b-4a75-b754-d7c92507e241-utilities\") pod \"certified-operators-jfqln\" (UID: \"a77dd9b9-647b-4a75-b754-d7c92507e241\") " pod="openshift-marketplace/certified-operators-jfqln" Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 13:37:42.508863 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gct72\" (UniqueName: \"kubernetes.io/projected/a77dd9b9-647b-4a75-b754-d7c92507e241-kube-api-access-gct72\") pod \"certified-operators-jfqln\" (UID: \"a77dd9b9-647b-4a75-b754-d7c92507e241\") " pod="openshift-marketplace/certified-operators-jfqln" Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 
13:37:42.611212 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a77dd9b9-647b-4a75-b754-d7c92507e241-catalog-content\") pod \"certified-operators-jfqln\" (UID: \"a77dd9b9-647b-4a75-b754-d7c92507e241\") " pod="openshift-marketplace/certified-operators-jfqln" Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 13:37:42.611706 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a77dd9b9-647b-4a75-b754-d7c92507e241-utilities\") pod \"certified-operators-jfqln\" (UID: \"a77dd9b9-647b-4a75-b754-d7c92507e241\") " pod="openshift-marketplace/certified-operators-jfqln" Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 13:37:42.611830 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a77dd9b9-647b-4a75-b754-d7c92507e241-catalog-content\") pod \"certified-operators-jfqln\" (UID: \"a77dd9b9-647b-4a75-b754-d7c92507e241\") " pod="openshift-marketplace/certified-operators-jfqln" Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 13:37:42.612191 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a77dd9b9-647b-4a75-b754-d7c92507e241-utilities\") pod \"certified-operators-jfqln\" (UID: \"a77dd9b9-647b-4a75-b754-d7c92507e241\") " pod="openshift-marketplace/certified-operators-jfqln" Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 13:37:42.612406 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gct72\" (UniqueName: \"kubernetes.io/projected/a77dd9b9-647b-4a75-b754-d7c92507e241-kube-api-access-gct72\") pod \"certified-operators-jfqln\" (UID: \"a77dd9b9-647b-4a75-b754-d7c92507e241\") " pod="openshift-marketplace/certified-operators-jfqln" Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 13:37:42.639068 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gct72\" (UniqueName: \"kubernetes.io/projected/a77dd9b9-647b-4a75-b754-d7c92507e241-kube-api-access-gct72\") pod \"certified-operators-jfqln\" (UID: \"a77dd9b9-647b-4a75-b754-d7c92507e241\") " pod="openshift-marketplace/certified-operators-jfqln" Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 13:37:42.740022 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jfqln" Mar 09 13:37:43 crc kubenswrapper[4764]: I0309 13:37:43.202139 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jfqln"] Mar 09 13:37:43 crc kubenswrapper[4764]: I0309 13:37:43.446632 4764 generic.go:334] "Generic (PLEG): container finished" podID="a77dd9b9-647b-4a75-b754-d7c92507e241" containerID="9cc110aa0cbe4a810cdc27fc5d3549dd4a1ef08a16d0a88aba121e7def0205ec" exitCode=0 Mar 09 13:37:43 crc kubenswrapper[4764]: I0309 13:37:43.446699 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfqln" event={"ID":"a77dd9b9-647b-4a75-b754-d7c92507e241","Type":"ContainerDied","Data":"9cc110aa0cbe4a810cdc27fc5d3549dd4a1ef08a16d0a88aba121e7def0205ec"} Mar 09 13:37:43 crc kubenswrapper[4764]: I0309 13:37:43.446731 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfqln" event={"ID":"a77dd9b9-647b-4a75-b754-d7c92507e241","Type":"ContainerStarted","Data":"fea157f4e9abb709a49af60bce7d1a5f75e60dccdc7d1d8642fcea7367aa5768"} Mar 09 13:37:44 crc kubenswrapper[4764]: I0309 13:37:44.455463 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfqln" event={"ID":"a77dd9b9-647b-4a75-b754-d7c92507e241","Type":"ContainerStarted","Data":"b5fc661ae561d0866cfa65888b69c005d4e29c4b626a7af3e20fd0370ac554ca"} Mar 09 13:37:45 crc kubenswrapper[4764]: I0309 13:37:45.463617 4764 
generic.go:334] "Generic (PLEG): container finished" podID="a77dd9b9-647b-4a75-b754-d7c92507e241" containerID="b5fc661ae561d0866cfa65888b69c005d4e29c4b626a7af3e20fd0370ac554ca" exitCode=0 Mar 09 13:37:45 crc kubenswrapper[4764]: I0309 13:37:45.463697 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfqln" event={"ID":"a77dd9b9-647b-4a75-b754-d7c92507e241","Type":"ContainerDied","Data":"b5fc661ae561d0866cfa65888b69c005d4e29c4b626a7af3e20fd0370ac554ca"} Mar 09 13:37:47 crc kubenswrapper[4764]: I0309 13:37:47.480855 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfqln" event={"ID":"a77dd9b9-647b-4a75-b754-d7c92507e241","Type":"ContainerStarted","Data":"c6becf3bd2bee1ada31af980926b44680472b5286412c9246d30f2b373c59019"} Mar 09 13:37:47 crc kubenswrapper[4764]: I0309 13:37:47.502713 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jfqln" podStartSLOduration=2.000163354 podStartE2EDuration="5.502626864s" podCreationTimestamp="2026-03-09 13:37:42 +0000 UTC" firstStartedPulling="2026-03-09 13:37:43.448577372 +0000 UTC m=+1018.698749280" lastFinishedPulling="2026-03-09 13:37:46.951040882 +0000 UTC m=+1022.201212790" observedRunningTime="2026-03-09 13:37:47.498514604 +0000 UTC m=+1022.748686512" watchObservedRunningTime="2026-03-09 13:37:47.502626864 +0000 UTC m=+1022.752798772" Mar 09 13:37:48 crc kubenswrapper[4764]: I0309 13:37:48.617073 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6754b7f846-ns9zn" Mar 09 13:37:52 crc kubenswrapper[4764]: I0309 13:37:52.740408 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jfqln" Mar 09 13:37:52 crc kubenswrapper[4764]: I0309 13:37:52.740810 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-jfqln" Mar 09 13:37:52 crc kubenswrapper[4764]: I0309 13:37:52.807189 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jfqln" Mar 09 13:37:53 crc kubenswrapper[4764]: I0309 13:37:53.568265 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jfqln" Mar 09 13:37:53 crc kubenswrapper[4764]: I0309 13:37:53.612839 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jfqln"] Mar 09 13:37:55 crc kubenswrapper[4764]: I0309 13:37:55.549397 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jfqln" podUID="a77dd9b9-647b-4a75-b754-d7c92507e241" containerName="registry-server" containerID="cri-o://c6becf3bd2bee1ada31af980926b44680472b5286412c9246d30f2b373c59019" gracePeriod=2 Mar 09 13:37:56 crc kubenswrapper[4764]: I0309 13:37:56.567766 4764 generic.go:334] "Generic (PLEG): container finished" podID="a77dd9b9-647b-4a75-b754-d7c92507e241" containerID="c6becf3bd2bee1ada31af980926b44680472b5286412c9246d30f2b373c59019" exitCode=0 Mar 09 13:37:56 crc kubenswrapper[4764]: I0309 13:37:56.567838 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfqln" event={"ID":"a77dd9b9-647b-4a75-b754-d7c92507e241","Type":"ContainerDied","Data":"c6becf3bd2bee1ada31af980926b44680472b5286412c9246d30f2b373c59019"} Mar 09 13:37:56 crc kubenswrapper[4764]: I0309 13:37:56.696113 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jfqln" Mar 09 13:37:56 crc kubenswrapper[4764]: I0309 13:37:56.765892 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a77dd9b9-647b-4a75-b754-d7c92507e241-catalog-content\") pod \"a77dd9b9-647b-4a75-b754-d7c92507e241\" (UID: \"a77dd9b9-647b-4a75-b754-d7c92507e241\") " Mar 09 13:37:56 crc kubenswrapper[4764]: I0309 13:37:56.765951 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gct72\" (UniqueName: \"kubernetes.io/projected/a77dd9b9-647b-4a75-b754-d7c92507e241-kube-api-access-gct72\") pod \"a77dd9b9-647b-4a75-b754-d7c92507e241\" (UID: \"a77dd9b9-647b-4a75-b754-d7c92507e241\") " Mar 09 13:37:56 crc kubenswrapper[4764]: I0309 13:37:56.766062 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a77dd9b9-647b-4a75-b754-d7c92507e241-utilities\") pod \"a77dd9b9-647b-4a75-b754-d7c92507e241\" (UID: \"a77dd9b9-647b-4a75-b754-d7c92507e241\") " Mar 09 13:37:56 crc kubenswrapper[4764]: I0309 13:37:56.767069 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a77dd9b9-647b-4a75-b754-d7c92507e241-utilities" (OuterVolumeSpecName: "utilities") pod "a77dd9b9-647b-4a75-b754-d7c92507e241" (UID: "a77dd9b9-647b-4a75-b754-d7c92507e241"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:37:56 crc kubenswrapper[4764]: I0309 13:37:56.774664 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a77dd9b9-647b-4a75-b754-d7c92507e241-kube-api-access-gct72" (OuterVolumeSpecName: "kube-api-access-gct72") pod "a77dd9b9-647b-4a75-b754-d7c92507e241" (UID: "a77dd9b9-647b-4a75-b754-d7c92507e241"). InnerVolumeSpecName "kube-api-access-gct72". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:37:56 crc kubenswrapper[4764]: I0309 13:37:56.834236 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a77dd9b9-647b-4a75-b754-d7c92507e241-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a77dd9b9-647b-4a75-b754-d7c92507e241" (UID: "a77dd9b9-647b-4a75-b754-d7c92507e241"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:37:56 crc kubenswrapper[4764]: I0309 13:37:56.868375 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a77dd9b9-647b-4a75-b754-d7c92507e241-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:56 crc kubenswrapper[4764]: I0309 13:37:56.868413 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gct72\" (UniqueName: \"kubernetes.io/projected/a77dd9b9-647b-4a75-b754-d7c92507e241-kube-api-access-gct72\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:56 crc kubenswrapper[4764]: I0309 13:37:56.868428 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a77dd9b9-647b-4a75-b754-d7c92507e241-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:57 crc kubenswrapper[4764]: I0309 13:37:57.577685 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfqln" event={"ID":"a77dd9b9-647b-4a75-b754-d7c92507e241","Type":"ContainerDied","Data":"fea157f4e9abb709a49af60bce7d1a5f75e60dccdc7d1d8642fcea7367aa5768"} Mar 09 13:37:57 crc kubenswrapper[4764]: I0309 13:37:57.577740 4764 scope.go:117] "RemoveContainer" containerID="c6becf3bd2bee1ada31af980926b44680472b5286412c9246d30f2b373c59019" Mar 09 13:37:57 crc kubenswrapper[4764]: I0309 13:37:57.577760 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jfqln" Mar 09 13:37:57 crc kubenswrapper[4764]: I0309 13:37:57.599064 4764 scope.go:117] "RemoveContainer" containerID="b5fc661ae561d0866cfa65888b69c005d4e29c4b626a7af3e20fd0370ac554ca" Mar 09 13:37:57 crc kubenswrapper[4764]: I0309 13:37:57.618291 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jfqln"] Mar 09 13:37:57 crc kubenswrapper[4764]: I0309 13:37:57.631436 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jfqln"] Mar 09 13:37:57 crc kubenswrapper[4764]: I0309 13:37:57.631554 4764 scope.go:117] "RemoveContainer" containerID="9cc110aa0cbe4a810cdc27fc5d3549dd4a1ef08a16d0a88aba121e7def0205ec" Mar 09 13:37:58 crc kubenswrapper[4764]: I0309 13:37:58.370753 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:37:58 crc kubenswrapper[4764]: I0309 13:37:58.371050 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:37:58 crc kubenswrapper[4764]: I0309 13:37:58.371095 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:37:58 crc kubenswrapper[4764]: I0309 13:37:58.371763 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"bd218b04a62035d950083134e237631295493daecba2baa1eef096560f891819"} pod="openshift-machine-config-operator/machine-config-daemon-xxczl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 13:37:58 crc kubenswrapper[4764]: I0309 13:37:58.371818 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" containerID="cri-o://bd218b04a62035d950083134e237631295493daecba2baa1eef096560f891819" gracePeriod=600 Mar 09 13:37:58 crc kubenswrapper[4764]: I0309 13:37:58.604247 4764 generic.go:334] "Generic (PLEG): container finished" podID="6bcdd179-43c2-427c-9fac-7155c122e922" containerID="bd218b04a62035d950083134e237631295493daecba2baa1eef096560f891819" exitCode=0 Mar 09 13:37:58 crc kubenswrapper[4764]: I0309 13:37:58.604363 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerDied","Data":"bd218b04a62035d950083134e237631295493daecba2baa1eef096560f891819"} Mar 09 13:37:58 crc kubenswrapper[4764]: I0309 13:37:58.604413 4764 scope.go:117] "RemoveContainer" containerID="3e004d46e664a823c675c177e537cdd2b21dcfc6ff9afcb1b390da1939340817" Mar 09 13:37:59 crc kubenswrapper[4764]: I0309 13:37:59.569990 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a77dd9b9-647b-4a75-b754-d7c92507e241" path="/var/lib/kubelet/pods/a77dd9b9-647b-4a75-b754-d7c92507e241/volumes" Mar 09 13:37:59 crc kubenswrapper[4764]: I0309 13:37:59.620383 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" 
event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"089fede4f6de3963d0297d0b784543ed7a9d38cdefe66f6bbb9aab989dbffbb0"} Mar 09 13:38:00 crc kubenswrapper[4764]: I0309 13:38:00.144754 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551058-6mlbf"] Mar 09 13:38:00 crc kubenswrapper[4764]: E0309 13:38:00.145408 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a77dd9b9-647b-4a75-b754-d7c92507e241" containerName="extract-utilities" Mar 09 13:38:00 crc kubenswrapper[4764]: I0309 13:38:00.145430 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a77dd9b9-647b-4a75-b754-d7c92507e241" containerName="extract-utilities" Mar 09 13:38:00 crc kubenswrapper[4764]: E0309 13:38:00.145448 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a77dd9b9-647b-4a75-b754-d7c92507e241" containerName="extract-content" Mar 09 13:38:00 crc kubenswrapper[4764]: I0309 13:38:00.145455 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a77dd9b9-647b-4a75-b754-d7c92507e241" containerName="extract-content" Mar 09 13:38:00 crc kubenswrapper[4764]: E0309 13:38:00.145471 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a77dd9b9-647b-4a75-b754-d7c92507e241" containerName="registry-server" Mar 09 13:38:00 crc kubenswrapper[4764]: I0309 13:38:00.145479 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a77dd9b9-647b-4a75-b754-d7c92507e241" containerName="registry-server" Mar 09 13:38:00 crc kubenswrapper[4764]: I0309 13:38:00.145618 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a77dd9b9-647b-4a75-b754-d7c92507e241" containerName="registry-server" Mar 09 13:38:00 crc kubenswrapper[4764]: I0309 13:38:00.146176 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551058-6mlbf" Mar 09 13:38:00 crc kubenswrapper[4764]: I0309 13:38:00.147987 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 13:38:00 crc kubenswrapper[4764]: I0309 13:38:00.148565 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:38:00 crc kubenswrapper[4764]: I0309 13:38:00.150966 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:38:00 crc kubenswrapper[4764]: I0309 13:38:00.152923 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551058-6mlbf"] Mar 09 13:38:00 crc kubenswrapper[4764]: I0309 13:38:00.213702 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mps4k\" (UniqueName: \"kubernetes.io/projected/175910d6-eb27-4000-ac8b-9ea49f05bb8b-kube-api-access-mps4k\") pod \"auto-csr-approver-29551058-6mlbf\" (UID: \"175910d6-eb27-4000-ac8b-9ea49f05bb8b\") " pod="openshift-infra/auto-csr-approver-29551058-6mlbf" Mar 09 13:38:00 crc kubenswrapper[4764]: I0309 13:38:00.315505 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mps4k\" (UniqueName: \"kubernetes.io/projected/175910d6-eb27-4000-ac8b-9ea49f05bb8b-kube-api-access-mps4k\") pod \"auto-csr-approver-29551058-6mlbf\" (UID: \"175910d6-eb27-4000-ac8b-9ea49f05bb8b\") " pod="openshift-infra/auto-csr-approver-29551058-6mlbf" Mar 09 13:38:00 crc kubenswrapper[4764]: I0309 13:38:00.338670 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mps4k\" (UniqueName: \"kubernetes.io/projected/175910d6-eb27-4000-ac8b-9ea49f05bb8b-kube-api-access-mps4k\") pod \"auto-csr-approver-29551058-6mlbf\" (UID: \"175910d6-eb27-4000-ac8b-9ea49f05bb8b\") " 
pod="openshift-infra/auto-csr-approver-29551058-6mlbf" Mar 09 13:38:00 crc kubenswrapper[4764]: I0309 13:38:00.496537 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551058-6mlbf" Mar 09 13:38:01 crc kubenswrapper[4764]: I0309 13:38:01.006329 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551058-6mlbf"] Mar 09 13:38:01 crc kubenswrapper[4764]: I0309 13:38:01.636588 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551058-6mlbf" event={"ID":"175910d6-eb27-4000-ac8b-9ea49f05bb8b","Type":"ContainerStarted","Data":"ce5182a953f7d877c80b916ad134db136116956777eedbeb4d0f7a167e307e06"} Mar 09 13:38:02 crc kubenswrapper[4764]: I0309 13:38:02.646937 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551058-6mlbf" event={"ID":"175910d6-eb27-4000-ac8b-9ea49f05bb8b","Type":"ContainerStarted","Data":"c0f759ebfc59d520d002517970a3889749936b75779f65069038f6e34fd87723"} Mar 09 13:38:02 crc kubenswrapper[4764]: I0309 13:38:02.691935 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551058-6mlbf" podStartSLOduration=1.792670766 podStartE2EDuration="2.691911067s" podCreationTimestamp="2026-03-09 13:38:00 +0000 UTC" firstStartedPulling="2026-03-09 13:38:01.019217915 +0000 UTC m=+1036.269389823" lastFinishedPulling="2026-03-09 13:38:01.918458216 +0000 UTC m=+1037.168630124" observedRunningTime="2026-03-09 13:38:02.689485514 +0000 UTC m=+1037.939657422" watchObservedRunningTime="2026-03-09 13:38:02.691911067 +0000 UTC m=+1037.942082985" Mar 09 13:38:03 crc kubenswrapper[4764]: I0309 13:38:03.653864 4764 generic.go:334] "Generic (PLEG): container finished" podID="175910d6-eb27-4000-ac8b-9ea49f05bb8b" containerID="c0f759ebfc59d520d002517970a3889749936b75779f65069038f6e34fd87723" exitCode=0 Mar 09 13:38:03 crc 
kubenswrapper[4764]: I0309 13:38:03.654053 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551058-6mlbf" event={"ID":"175910d6-eb27-4000-ac8b-9ea49f05bb8b","Type":"ContainerDied","Data":"c0f759ebfc59d520d002517970a3889749936b75779f65069038f6e34fd87723"} Mar 09 13:38:05 crc kubenswrapper[4764]: I0309 13:38:05.033331 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551058-6mlbf" Mar 09 13:38:05 crc kubenswrapper[4764]: I0309 13:38:05.191616 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mps4k\" (UniqueName: \"kubernetes.io/projected/175910d6-eb27-4000-ac8b-9ea49f05bb8b-kube-api-access-mps4k\") pod \"175910d6-eb27-4000-ac8b-9ea49f05bb8b\" (UID: \"175910d6-eb27-4000-ac8b-9ea49f05bb8b\") " Mar 09 13:38:05 crc kubenswrapper[4764]: I0309 13:38:05.201428 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/175910d6-eb27-4000-ac8b-9ea49f05bb8b-kube-api-access-mps4k" (OuterVolumeSpecName: "kube-api-access-mps4k") pod "175910d6-eb27-4000-ac8b-9ea49f05bb8b" (UID: "175910d6-eb27-4000-ac8b-9ea49f05bb8b"). InnerVolumeSpecName "kube-api-access-mps4k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:38:05 crc kubenswrapper[4764]: I0309 13:38:05.293822 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mps4k\" (UniqueName: \"kubernetes.io/projected/175910d6-eb27-4000-ac8b-9ea49f05bb8b-kube-api-access-mps4k\") on node \"crc\" DevicePath \"\"" Mar 09 13:38:05 crc kubenswrapper[4764]: I0309 13:38:05.668383 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551058-6mlbf" event={"ID":"175910d6-eb27-4000-ac8b-9ea49f05bb8b","Type":"ContainerDied","Data":"ce5182a953f7d877c80b916ad134db136116956777eedbeb4d0f7a167e307e06"} Mar 09 13:38:05 crc kubenswrapper[4764]: I0309 13:38:05.668862 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce5182a953f7d877c80b916ad134db136116956777eedbeb4d0f7a167e307e06" Mar 09 13:38:05 crc kubenswrapper[4764]: I0309 13:38:05.668462 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551058-6mlbf" Mar 09 13:38:06 crc kubenswrapper[4764]: I0309 13:38:06.091717 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551052-t652n"] Mar 09 13:38:06 crc kubenswrapper[4764]: I0309 13:38:06.096667 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551052-t652n"] Mar 09 13:38:07 crc kubenswrapper[4764]: I0309 13:38:07.566970 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee50d407-01a6-43e7-833e-b803dbb4792f" path="/var/lib/kubelet/pods/ee50d407-01a6-43e7-833e-b803dbb4792f/volumes" Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.649604 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-82cg8"] Mar 09 13:38:08 crc kubenswrapper[4764]: E0309 13:38:08.650265 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="175910d6-eb27-4000-ac8b-9ea49f05bb8b" containerName="oc"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.650281 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="175910d6-eb27-4000-ac8b-9ea49f05bb8b" containerName="oc"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.650427 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="175910d6-eb27-4000-ac8b-9ea49f05bb8b" containerName="oc"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.651026 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-82cg8"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.654794 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-cdpv6"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.656924 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-nppjq"]
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.657922 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-nppjq"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.665300 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-5dvsf"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.673878 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-82cg8"]
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.691532 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-nppjq"]
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.708693 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-cmtpc"]
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.709525 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-cmtpc"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.717337 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-s8xv6"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.737848 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-cmtpc"]
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.742189 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znvfv\" (UniqueName: \"kubernetes.io/projected/e220a3f1-4dbe-4ee6-9b19-26985fa998cf-kube-api-access-znvfv\") pod \"barbican-operator-controller-manager-6db6876945-82cg8\" (UID: \"e220a3f1-4dbe-4ee6-9b19-26985fa998cf\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-82cg8"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.742233 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnz4t\" (UniqueName: \"kubernetes.io/projected/4c271ca0-0c25-46d1-b730-e94f68397e29-kube-api-access-lnz4t\") pod \"cinder-operator-controller-manager-55d77d7b5c-nppjq\" (UID: \"4c271ca0-0c25-46d1-b730-e94f68397e29\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-nppjq"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.772257 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-mjf6m"]
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.773341 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mjf6m"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.780055 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-924l7"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.797420 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-mjf6m"]
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.826695 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-jnmbv"]
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.827480 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-jnmbv"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.837916 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-7cd2b"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.845411 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95kb4\" (UniqueName: \"kubernetes.io/projected/488ff419-d889-4778-96cf-a11006c49507-kube-api-access-95kb4\") pod \"glance-operator-controller-manager-64db6967f8-mjf6m\" (UID: \"488ff419-d889-4778-96cf-a11006c49507\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mjf6m"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.845517 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znvfv\" (UniqueName: \"kubernetes.io/projected/e220a3f1-4dbe-4ee6-9b19-26985fa998cf-kube-api-access-znvfv\") pod \"barbican-operator-controller-manager-6db6876945-82cg8\" (UID: \"e220a3f1-4dbe-4ee6-9b19-26985fa998cf\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-82cg8"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.845550 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnz4t\" (UniqueName: \"kubernetes.io/projected/4c271ca0-0c25-46d1-b730-e94f68397e29-kube-api-access-lnz4t\") pod \"cinder-operator-controller-manager-55d77d7b5c-nppjq\" (UID: \"4c271ca0-0c25-46d1-b730-e94f68397e29\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-nppjq"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.845627 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkgj9\" (UniqueName: \"kubernetes.io/projected/725c0dd0-07d1-4a1c-b223-e8bec76cc7ff-kube-api-access-lkgj9\") pod \"designate-operator-controller-manager-5d87c9d997-cmtpc\" (UID: \"725c0dd0-07d1-4a1c-b223-e8bec76cc7ff\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-cmtpc"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.860205 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5xc2s"]
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.861134 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5xc2s"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.867188 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-t6qtj"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.872733 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5xc2s"]
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.893633 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnz4t\" (UniqueName: \"kubernetes.io/projected/4c271ca0-0c25-46d1-b730-e94f68397e29-kube-api-access-lnz4t\") pod \"cinder-operator-controller-manager-55d77d7b5c-nppjq\" (UID: \"4c271ca0-0c25-46d1-b730-e94f68397e29\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-nppjq"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.894253 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znvfv\" (UniqueName: \"kubernetes.io/projected/e220a3f1-4dbe-4ee6-9b19-26985fa998cf-kube-api-access-znvfv\") pod \"barbican-operator-controller-manager-6db6876945-82cg8\" (UID: \"e220a3f1-4dbe-4ee6-9b19-26985fa998cf\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-82cg8"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.901165 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-jnmbv"]
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.907763 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-hvpbz"]
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.919898 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hvpbz"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.920072 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9"]
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.925037 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-j6rd9"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.927488 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.938864 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.939201 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-s9gg2"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.946606 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkvqv\" (UniqueName: \"kubernetes.io/projected/3da43711-be34-4189-b686-e8e9bc9e7265-kube-api-access-dkvqv\") pod \"horizon-operator-controller-manager-78bc7f9bd9-5xc2s\" (UID: \"3da43711-be34-4189-b686-e8e9bc9e7265\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5xc2s"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.946765 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkgj9\" (UniqueName: \"kubernetes.io/projected/725c0dd0-07d1-4a1c-b223-e8bec76cc7ff-kube-api-access-lkgj9\") pod \"designate-operator-controller-manager-5d87c9d997-cmtpc\" (UID: \"725c0dd0-07d1-4a1c-b223-e8bec76cc7ff\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-cmtpc"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.946793 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95kb4\" (UniqueName: \"kubernetes.io/projected/488ff419-d889-4778-96cf-a11006c49507-kube-api-access-95kb4\") pod \"glance-operator-controller-manager-64db6967f8-mjf6m\" (UID: \"488ff419-d889-4778-96cf-a11006c49507\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mjf6m"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.946825 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cqd9\" (UniqueName: \"kubernetes.io/projected/7295db10-1c36-4c17-bf1e-4c4a702c201b-kube-api-access-2cqd9\") pod \"heat-operator-controller-manager-cf99c678f-jnmbv\" (UID: \"7295db10-1c36-4c17-bf1e-4c4a702c201b\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-jnmbv"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.977054 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-82cg8"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.983202 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-hvpbz"]
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.988920 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-nppjq"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.989522 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c7bcbc569-qhpvs"]
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.001556 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c7bcbc569-qhpvs"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.009967 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-xj97p"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.015344 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c7bcbc569-qhpvs"]
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.029692 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkgj9\" (UniqueName: \"kubernetes.io/projected/725c0dd0-07d1-4a1c-b223-e8bec76cc7ff-kube-api-access-lkgj9\") pod \"designate-operator-controller-manager-5d87c9d997-cmtpc\" (UID: \"725c0dd0-07d1-4a1c-b223-e8bec76cc7ff\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-cmtpc"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.039054 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-cmtpc"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.049248 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9"]
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.049993 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcznz\" (UniqueName: \"kubernetes.io/projected/5cd7eb92-2fae-4978-a5e9-58fa87c63e84-kube-api-access-bcznz\") pod \"manila-operator-controller-manager-7c7bcbc569-qhpvs\" (UID: \"5cd7eb92-2fae-4978-a5e9-58fa87c63e84\") " pod="openstack-operators/manila-operator-controller-manager-7c7bcbc569-qhpvs"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.050055 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-652xw\" (UniqueName: \"kubernetes.io/projected/5bd11a3c-79f3-4aa2-9d5a-78ec3c18cb31-kube-api-access-652xw\") pod \"ironic-operator-controller-manager-545456dc4-hvpbz\" (UID: \"5bd11a3c-79f3-4aa2-9d5a-78ec3c18cb31\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hvpbz"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.050097 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cqd9\" (UniqueName: \"kubernetes.io/projected/7295db10-1c36-4c17-bf1e-4c4a702c201b-kube-api-access-2cqd9\") pod \"heat-operator-controller-manager-cf99c678f-jnmbv\" (UID: \"7295db10-1c36-4c17-bf1e-4c4a702c201b\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-jnmbv"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.050124 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7z99\" (UniqueName: \"kubernetes.io/projected/bfda7896-83e3-407c-9eb5-74fbc11104f0-kube-api-access-h7z99\") pod \"infra-operator-controller-manager-5995f4446f-m58s9\" (UID: \"bfda7896-83e3-407c-9eb5-74fbc11104f0\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.050150 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert\") pod \"infra-operator-controller-manager-5995f4446f-m58s9\" (UID: \"bfda7896-83e3-407c-9eb5-74fbc11104f0\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.050181 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkvqv\" (UniqueName: \"kubernetes.io/projected/3da43711-be34-4189-b686-e8e9bc9e7265-kube-api-access-dkvqv\") pod \"horizon-operator-controller-manager-78bc7f9bd9-5xc2s\" (UID: \"3da43711-be34-4189-b686-e8e9bc9e7265\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5xc2s"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.058967 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95kb4\" (UniqueName: \"kubernetes.io/projected/488ff419-d889-4778-96cf-a11006c49507-kube-api-access-95kb4\") pod \"glance-operator-controller-manager-64db6967f8-mjf6m\" (UID: \"488ff419-d889-4778-96cf-a11006c49507\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mjf6m"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.088526 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-wv2rp"]
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.089385 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wv2rp"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.098998 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mjf6m"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.101622 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-wv2rp"]
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.121908 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkvqv\" (UniqueName: \"kubernetes.io/projected/3da43711-be34-4189-b686-e8e9bc9e7265-kube-api-access-dkvqv\") pod \"horizon-operator-controller-manager-78bc7f9bd9-5xc2s\" (UID: \"3da43711-be34-4189-b686-e8e9bc9e7265\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5xc2s"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.136405 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-zqsnk"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.142590 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cqd9\" (UniqueName: \"kubernetes.io/projected/7295db10-1c36-4c17-bf1e-4c4a702c201b-kube-api-access-2cqd9\") pod \"heat-operator-controller-manager-cf99c678f-jnmbv\" (UID: \"7295db10-1c36-4c17-bf1e-4c4a702c201b\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-jnmbv"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.152556 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcznz\" (UniqueName: \"kubernetes.io/projected/5cd7eb92-2fae-4978-a5e9-58fa87c63e84-kube-api-access-bcznz\") pod \"manila-operator-controller-manager-7c7bcbc569-qhpvs\" (UID: \"5cd7eb92-2fae-4978-a5e9-58fa87c63e84\") " pod="openstack-operators/manila-operator-controller-manager-7c7bcbc569-qhpvs"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.152640 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-652xw\" (UniqueName: \"kubernetes.io/projected/5bd11a3c-79f3-4aa2-9d5a-78ec3c18cb31-kube-api-access-652xw\") pod \"ironic-operator-controller-manager-545456dc4-hvpbz\" (UID: \"5bd11a3c-79f3-4aa2-9d5a-78ec3c18cb31\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hvpbz"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.152816 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7z99\" (UniqueName: \"kubernetes.io/projected/bfda7896-83e3-407c-9eb5-74fbc11104f0-kube-api-access-h7z99\") pod \"infra-operator-controller-manager-5995f4446f-m58s9\" (UID: \"bfda7896-83e3-407c-9eb5-74fbc11104f0\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.152855 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flbm2\" (UniqueName: \"kubernetes.io/projected/32eb5815-c566-4177-8b47-f756807d4a30-kube-api-access-flbm2\") pod \"keystone-operator-controller-manager-7c789f89c6-wv2rp\" (UID: \"32eb5815-c566-4177-8b47-f756807d4a30\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wv2rp"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.152895 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert\") pod \"infra-operator-controller-manager-5995f4446f-m58s9\" (UID: \"bfda7896-83e3-407c-9eb5-74fbc11104f0\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9"
Mar 09 13:38:09 crc kubenswrapper[4764]: E0309 13:38:09.153080 4764 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 09 13:38:09 crc kubenswrapper[4764]: E0309 13:38:09.153144 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert podName:bfda7896-83e3-407c-9eb5-74fbc11104f0 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:09.653123095 +0000 UTC m=+1044.903295003 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert") pod "infra-operator-controller-manager-5995f4446f-m58s9" (UID: "bfda7896-83e3-407c-9eb5-74fbc11104f0") : secret "infra-operator-webhook-server-cert" not found
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.170630 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-jnmbv"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.201635 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5xc2s"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.202170 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-cgv66"]
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.203563 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-cgv66"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.205947 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7z99\" (UniqueName: \"kubernetes.io/projected/bfda7896-83e3-407c-9eb5-74fbc11104f0-kube-api-access-h7z99\") pod \"infra-operator-controller-manager-5995f4446f-m58s9\" (UID: \"bfda7896-83e3-407c-9eb5-74fbc11104f0\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.259431 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-652xw\" (UniqueName: \"kubernetes.io/projected/5bd11a3c-79f3-4aa2-9d5a-78ec3c18cb31-kube-api-access-652xw\") pod \"ironic-operator-controller-manager-545456dc4-hvpbz\" (UID: \"5bd11a3c-79f3-4aa2-9d5a-78ec3c18cb31\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hvpbz"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.260306 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-dm7rn"]
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.261543 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dm7rn"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.262586 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flbm2\" (UniqueName: \"kubernetes.io/projected/32eb5815-c566-4177-8b47-f756807d4a30-kube-api-access-flbm2\") pod \"keystone-operator-controller-manager-7c789f89c6-wv2rp\" (UID: \"32eb5815-c566-4177-8b47-f756807d4a30\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wv2rp"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.262712 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rf24\" (UniqueName: \"kubernetes.io/projected/2ddf1e89-9c89-4052-aa1b-6fb84438b86d-kube-api-access-5rf24\") pod \"neutron-operator-controller-manager-54688575f-cgv66\" (UID: \"2ddf1e89-9c89-4052-aa1b-6fb84438b86d\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-cgv66"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.268276 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-bhzwg"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.269503 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hvpbz"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.272263 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-vkns5"]
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.273306 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-vkns5"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.273706 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-jd85k"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.274364 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcznz\" (UniqueName: \"kubernetes.io/projected/5cd7eb92-2fae-4978-a5e9-58fa87c63e84-kube-api-access-bcznz\") pod \"manila-operator-controller-manager-7c7bcbc569-qhpvs\" (UID: \"5cd7eb92-2fae-4978-a5e9-58fa87c63e84\") " pod="openstack-operators/manila-operator-controller-manager-7c7bcbc569-qhpvs"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.278074 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-68h76"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.289726 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-cgv66"]
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.307899 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-dm7rn"]
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.325758 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-vkns5"]
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.328345 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flbm2\" (UniqueName: \"kubernetes.io/projected/32eb5815-c566-4177-8b47-f756807d4a30-kube-api-access-flbm2\") pod \"keystone-operator-controller-manager-7c789f89c6-wv2rp\" (UID: \"32eb5815-c566-4177-8b47-f756807d4a30\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wv2rp"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.363811 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rf24\" (UniqueName: \"kubernetes.io/projected/2ddf1e89-9c89-4052-aa1b-6fb84438b86d-kube-api-access-5rf24\") pod \"neutron-operator-controller-manager-54688575f-cgv66\" (UID: \"2ddf1e89-9c89-4052-aa1b-6fb84438b86d\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-cgv66"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.364868 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6jd4\" (UniqueName: \"kubernetes.io/projected/26535a82-8d70-4623-b2b4-7dd1546d48d6-kube-api-access-d6jd4\") pod \"nova-operator-controller-manager-74b6b5dc96-dm7rn\" (UID: \"26535a82-8d70-4623-b2b4-7dd1546d48d6\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dm7rn"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.364957 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvk8m\" (UniqueName: \"kubernetes.io/projected/da851ddd-2b27-45f0-b149-de32ae21ad91-kube-api-access-cvk8m\") pod \"mariadb-operator-controller-manager-7b6bfb6475-vkns5\" (UID: \"da851ddd-2b27-45f0-b149-de32ae21ad91\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-vkns5"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.398416 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rf24\" (UniqueName: \"kubernetes.io/projected/2ddf1e89-9c89-4052-aa1b-6fb84438b86d-kube-api-access-5rf24\") pod \"neutron-operator-controller-manager-54688575f-cgv66\" (UID: \"2ddf1e89-9c89-4052-aa1b-6fb84438b86d\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-cgv66"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.398863 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-gv2sm"]
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.400206 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-gv2sm"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.413443 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-gv2sm"]
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.413859 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c7bcbc569-qhpvs"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.424155 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-92kgv"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.437452 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-jfgzw"]
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.438576 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-jfgzw"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.449410 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-bpf6g"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.453082 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm"]
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.461117 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.466761 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvk8m\" (UniqueName: \"kubernetes.io/projected/da851ddd-2b27-45f0-b149-de32ae21ad91-kube-api-access-cvk8m\") pod \"mariadb-operator-controller-manager-7b6bfb6475-vkns5\" (UID: \"da851ddd-2b27-45f0-b149-de32ae21ad91\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-vkns5"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.466806 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz7h2\" (UniqueName: \"kubernetes.io/projected/615473d3-072e-4685-8f32-73a44badf1e2-kube-api-access-lz7h2\") pod \"ovn-operator-controller-manager-75684d597f-jfgzw\" (UID: \"615473d3-072e-4685-8f32-73a44badf1e2\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-jfgzw"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.466895 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6jd4\" (UniqueName: \"kubernetes.io/projected/26535a82-8d70-4623-b2b4-7dd1546d48d6-kube-api-access-d6jd4\") pod \"nova-operator-controller-manager-74b6b5dc96-dm7rn\" (UID: \"26535a82-8d70-4623-b2b4-7dd1546d48d6\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dm7rn"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.466935 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms9zj\" (UniqueName: \"kubernetes.io/projected/b54e2237-603a-44ad-a129-04736cf749b2-kube-api-access-ms9zj\") pod \"octavia-operator-controller-manager-5d86c7ddb7-gv2sm\" (UID: \"b54e2237-603a-44ad-a129-04736cf749b2\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-gv2sm"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.472406 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-jfgzw"]
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.485755 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-8ms5w"]
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.489017 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.489224 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-vftv5"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.490876 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-8ms5w"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.497142 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-bxjrz"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.509738 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bf8w8"]
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.512656 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bf8w8"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.517723 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvk8m\" (UniqueName: \"kubernetes.io/projected/da851ddd-2b27-45f0-b149-de32ae21ad91-kube-api-access-cvk8m\") pod \"mariadb-operator-controller-manager-7b6bfb6475-vkns5\" (UID: \"da851ddd-2b27-45f0-b149-de32ae21ad91\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-vkns5"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.518208 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-rgfgd"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.543678 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-4cpsz"]
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.545003 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-4cpsz" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.555455 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.566124 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-ftds8" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.566275 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6jd4\" (UniqueName: \"kubernetes.io/projected/26535a82-8d70-4623-b2b4-7dd1546d48d6-kube-api-access-d6jd4\") pod \"nova-operator-controller-manager-74b6b5dc96-dm7rn\" (UID: \"26535a82-8d70-4623-b2b4-7dd1546d48d6\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dm7rn" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.569142 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz7h2\" (UniqueName: \"kubernetes.io/projected/615473d3-072e-4685-8f32-73a44badf1e2-kube-api-access-lz7h2\") pod \"ovn-operator-controller-manager-75684d597f-jfgzw\" (UID: \"615473d3-072e-4685-8f32-73a44badf1e2\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-jfgzw" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.569284 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blgqv\" (UniqueName: \"kubernetes.io/projected/c6a51c61-770b-4b46-9dd3-1f61e6f5ebc8-kube-api-access-blgqv\") pod \"telemetry-operator-controller-manager-5fdb694969-4cpsz\" (UID: \"c6a51c61-770b-4b46-9dd3-1f61e6f5ebc8\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-4cpsz" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.569360 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdwmb\" (UniqueName: \"kubernetes.io/projected/47bd7072-a414-4ce8-800b-753b7054be23-kube-api-access-xdwmb\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm\" (UID: \"47bd7072-a414-4ce8-800b-753b7054be23\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.569385 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhwj4\" (UniqueName: \"kubernetes.io/projected/c44e76b2-0de9-4a5b-93ee-536c6300157f-kube-api-access-fhwj4\") pod \"placement-operator-controller-manager-648564c9fc-8ms5w\" (UID: \"c44e76b2-0de9-4a5b-93ee-536c6300157f\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-8ms5w" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.569491 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms9zj\" (UniqueName: \"kubernetes.io/projected/b54e2237-603a-44ad-a129-04736cf749b2-kube-api-access-ms9zj\") pod \"octavia-operator-controller-manager-5d86c7ddb7-gv2sm\" (UID: \"b54e2237-603a-44ad-a129-04736cf749b2\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-gv2sm" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.569556 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm\" (UID: \"47bd7072-a414-4ce8-800b-753b7054be23\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.569599 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-rhthv\" (UniqueName: \"kubernetes.io/projected/003210d3-5572-44bd-aae5-d5e24aac16a5-kube-api-access-rhthv\") pod \"swift-operator-controller-manager-9b9ff9f4d-bf8w8\" (UID: \"003210d3-5572-44bd-aae5-d5e24aac16a5\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bf8w8" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.569501 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wv2rp" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.583193 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-8ms5w"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.583228 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bf8w8"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.598416 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz7h2\" (UniqueName: \"kubernetes.io/projected/615473d3-072e-4685-8f32-73a44badf1e2-kube-api-access-lz7h2\") pod \"ovn-operator-controller-manager-75684d597f-jfgzw\" (UID: \"615473d3-072e-4685-8f32-73a44badf1e2\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-jfgzw" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.601921 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-4cpsz"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.605037 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms9zj\" (UniqueName: \"kubernetes.io/projected/b54e2237-603a-44ad-a129-04736cf749b2-kube-api-access-ms9zj\") pod \"octavia-operator-controller-manager-5d86c7ddb7-gv2sm\" (UID: \"b54e2237-603a-44ad-a129-04736cf749b2\") " 
pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-gv2sm" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.629357 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-d65xp"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.634218 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-d65xp" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.642393 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-qrhxx" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.658018 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-cgv66" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.660445 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dm7rn" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.670751 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-7f8nr"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.671564 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7f8nr" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.671989 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blgqv\" (UniqueName: \"kubernetes.io/projected/c6a51c61-770b-4b46-9dd3-1f61e6f5ebc8-kube-api-access-blgqv\") pod \"telemetry-operator-controller-manager-5fdb694969-4cpsz\" (UID: \"c6a51c61-770b-4b46-9dd3-1f61e6f5ebc8\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-4cpsz" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.672019 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdwmb\" (UniqueName: \"kubernetes.io/projected/47bd7072-a414-4ce8-800b-753b7054be23-kube-api-access-xdwmb\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm\" (UID: \"47bd7072-a414-4ce8-800b-753b7054be23\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.672038 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhwj4\" (UniqueName: \"kubernetes.io/projected/c44e76b2-0de9-4a5b-93ee-536c6300157f-kube-api-access-fhwj4\") pod \"placement-operator-controller-manager-648564c9fc-8ms5w\" (UID: \"c44e76b2-0de9-4a5b-93ee-536c6300157f\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-8ms5w" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.672091 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert\") pod \"infra-operator-controller-manager-5995f4446f-m58s9\" (UID: \"bfda7896-83e3-407c-9eb5-74fbc11104f0\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.672143 
4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm\" (UID: \"47bd7072-a414-4ce8-800b-753b7054be23\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.672172 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhthv\" (UniqueName: \"kubernetes.io/projected/003210d3-5572-44bd-aae5-d5e24aac16a5-kube-api-access-rhthv\") pod \"swift-operator-controller-manager-9b9ff9f4d-bf8w8\" (UID: \"003210d3-5572-44bd-aae5-d5e24aac16a5\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bf8w8" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.672211 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsvnd\" (UniqueName: \"kubernetes.io/projected/867908a2-f085-4f3d-b569-84c915f730b1-kube-api-access-gsvnd\") pod \"test-operator-controller-manager-55b5ff4dbb-d65xp\" (UID: \"867908a2-f085-4f3d-b569-84c915f730b1\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-d65xp" Mar 09 13:38:09 crc kubenswrapper[4764]: E0309 13:38:09.672481 4764 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 09 13:38:09 crc kubenswrapper[4764]: E0309 13:38:09.672528 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert podName:bfda7896-83e3-407c-9eb5-74fbc11104f0 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:10.67251234 +0000 UTC m=+1045.922684248 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert") pod "infra-operator-controller-manager-5995f4446f-m58s9" (UID: "bfda7896-83e3-407c-9eb5-74fbc11104f0") : secret "infra-operator-webhook-server-cert" not found Mar 09 13:38:09 crc kubenswrapper[4764]: E0309 13:38:09.673001 4764 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 13:38:09 crc kubenswrapper[4764]: E0309 13:38:09.673030 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert podName:47bd7072-a414-4ce8-800b-753b7054be23 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:10.173020753 +0000 UTC m=+1045.423192671 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" (UID: "47bd7072-a414-4ce8-800b-753b7054be23") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.675539 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-7f8nr"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.675719 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-vkns5" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.682179 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-lg2pt" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.695215 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-d65xp"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.705125 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blgqv\" (UniqueName: \"kubernetes.io/projected/c6a51c61-770b-4b46-9dd3-1f61e6f5ebc8-kube-api-access-blgqv\") pod \"telemetry-operator-controller-manager-5fdb694969-4cpsz\" (UID: \"c6a51c61-770b-4b46-9dd3-1f61e6f5ebc8\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-4cpsz" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.716864 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdwmb\" (UniqueName: \"kubernetes.io/projected/47bd7072-a414-4ce8-800b-753b7054be23-kube-api-access-xdwmb\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm\" (UID: \"47bd7072-a414-4ce8-800b-753b7054be23\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.718370 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhwj4\" (UniqueName: \"kubernetes.io/projected/c44e76b2-0de9-4a5b-93ee-536c6300157f-kube-api-access-fhwj4\") pod \"placement-operator-controller-manager-648564c9fc-8ms5w\" (UID: \"c44e76b2-0de9-4a5b-93ee-536c6300157f\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-8ms5w" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.724191 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rhthv\" (UniqueName: \"kubernetes.io/projected/003210d3-5572-44bd-aae5-d5e24aac16a5-kube-api-access-rhthv\") pod \"swift-operator-controller-manager-9b9ff9f4d-bf8w8\" (UID: \"003210d3-5572-44bd-aae5-d5e24aac16a5\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bf8w8" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.748395 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.750482 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.769049 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-4cpsz" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.779703 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.780088 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-p7k4x" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.780305 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.783142 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsvnd\" (UniqueName: \"kubernetes.io/projected/867908a2-f085-4f3d-b569-84c915f730b1-kube-api-access-gsvnd\") pod \"test-operator-controller-manager-55b5ff4dbb-d65xp\" (UID: \"867908a2-f085-4f3d-b569-84c915f730b1\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-d65xp" 
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.822447 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.831141 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsvnd\" (UniqueName: \"kubernetes.io/projected/867908a2-f085-4f3d-b569-84c915f730b1-kube-api-access-gsvnd\") pod \"test-operator-controller-manager-55b5ff4dbb-d65xp\" (UID: \"867908a2-f085-4f3d-b569-84c915f730b1\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-d65xp" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.848334 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6v2sq"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.850527 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6v2sq" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.857031 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6v2sq"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.860071 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-gv2sm" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.860877 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-w5dcl" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.880696 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-d65xp" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.884276 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.884343 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vpnw\" (UniqueName: \"kubernetes.io/projected/e11f44d8-58a5-4fc7-b05b-e2e688647d01-kube-api-access-6vpnw\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.884372 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5nrw\" (UniqueName: \"kubernetes.io/projected/f705ec78-e960-4200-b5a6-f3d4310f1bd5-kube-api-access-h5nrw\") pod \"watcher-operator-controller-manager-bccc79885-7f8nr\" (UID: \"f705ec78-e960-4200-b5a6-f3d4310f1bd5\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7f8nr" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.884754 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 
13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.902472 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-jfgzw" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.940180 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-82cg8"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.974020 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-8ms5w" Mar 09 13:38:09 crc kubenswrapper[4764]: W0309 13:38:09.986119 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode220a3f1_4dbe_4ee6_9b19_26985fa998cf.slice/crio-436d33e370c3500040ecc542247302bce24a8a00ee337b46e2f406a44bdaa218 WatchSource:0}: Error finding container 436d33e370c3500040ecc542247302bce24a8a00ee337b46e2f406a44bdaa218: Status 404 returned error can't find the container with id 436d33e370c3500040ecc542247302bce24a8a00ee337b46e2f406a44bdaa218 Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.987017 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb6g7\" (UniqueName: \"kubernetes.io/projected/01ea99aa-eb21-4799-9557-42c3fb55945a-kube-api-access-kb6g7\") pod \"rabbitmq-cluster-operator-manager-668c99d594-6v2sq\" (UID: \"01ea99aa-eb21-4799-9557-42c3fb55945a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6v2sq" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.987068 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: 
\"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.987102 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vpnw\" (UniqueName: \"kubernetes.io/projected/e11f44d8-58a5-4fc7-b05b-e2e688647d01-kube-api-access-6vpnw\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.987123 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5nrw\" (UniqueName: \"kubernetes.io/projected/f705ec78-e960-4200-b5a6-f3d4310f1bd5-kube-api-access-h5nrw\") pod \"watcher-operator-controller-manager-bccc79885-7f8nr\" (UID: \"f705ec78-e960-4200-b5a6-f3d4310f1bd5\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7f8nr" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.987199 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:09 crc kubenswrapper[4764]: E0309 13:38:09.989428 4764 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 09 13:38:09 crc kubenswrapper[4764]: E0309 13:38:09.989497 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs podName:e11f44d8-58a5-4fc7-b05b-e2e688647d01 nodeName:}" failed. 
No retries permitted until 2026-03-09 13:38:10.489471456 +0000 UTC m=+1045.739643364 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs") pod "openstack-operator-controller-manager-6746d697b-lr6nx" (UID: "e11f44d8-58a5-4fc7-b05b-e2e688647d01") : secret "webhook-server-cert" not found Mar 09 13:38:09 crc kubenswrapper[4764]: E0309 13:38:09.990160 4764 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 09 13:38:09 crc kubenswrapper[4764]: E0309 13:38:09.990187 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs podName:e11f44d8-58a5-4fc7-b05b-e2e688647d01 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:10.490178754 +0000 UTC m=+1045.740350662 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs") pod "openstack-operator-controller-manager-6746d697b-lr6nx" (UID: "e11f44d8-58a5-4fc7-b05b-e2e688647d01") : secret "metrics-server-cert" not found Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.999154 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bf8w8" Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.017354 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vpnw\" (UniqueName: \"kubernetes.io/projected/e11f44d8-58a5-4fc7-b05b-e2e688647d01-kube-api-access-6vpnw\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.024362 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5nrw\" (UniqueName: \"kubernetes.io/projected/f705ec78-e960-4200-b5a6-f3d4310f1bd5-kube-api-access-h5nrw\") pod \"watcher-operator-controller-manager-bccc79885-7f8nr\" (UID: \"f705ec78-e960-4200-b5a6-f3d4310f1bd5\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7f8nr" Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.089879 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb6g7\" (UniqueName: \"kubernetes.io/projected/01ea99aa-eb21-4799-9557-42c3fb55945a-kube-api-access-kb6g7\") pod \"rabbitmq-cluster-operator-manager-668c99d594-6v2sq\" (UID: \"01ea99aa-eb21-4799-9557-42c3fb55945a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6v2sq" Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.116259 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-nppjq"] Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.117205 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb6g7\" (UniqueName: \"kubernetes.io/projected/01ea99aa-eb21-4799-9557-42c3fb55945a-kube-api-access-kb6g7\") pod \"rabbitmq-cluster-operator-manager-668c99d594-6v2sq\" (UID: 
\"01ea99aa-eb21-4799-9557-42c3fb55945a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6v2sq" Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.128731 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6v2sq" Mar 09 13:38:10 crc kubenswrapper[4764]: W0309 13:38:10.165417 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c271ca0_0c25_46d1_b730_e94f68397e29.slice/crio-453cd0556a5a9d4e58f08c125aa16fcec22f7fbc83cd351c18f798adf3681e01 WatchSource:0}: Error finding container 453cd0556a5a9d4e58f08c125aa16fcec22f7fbc83cd351c18f798adf3681e01: Status 404 returned error can't find the container with id 453cd0556a5a9d4e58f08c125aa16fcec22f7fbc83cd351c18f798adf3681e01 Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.191992 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm\" (UID: \"47bd7072-a414-4ce8-800b-753b7054be23\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" Mar 09 13:38:10 crc kubenswrapper[4764]: E0309 13:38:10.192146 4764 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 13:38:10 crc kubenswrapper[4764]: E0309 13:38:10.192196 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert podName:47bd7072-a414-4ce8-800b-753b7054be23 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:11.192179772 +0000 UTC m=+1046.442351680 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" (UID: "47bd7072-a414-4ce8-800b-753b7054be23") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.220045 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7f8nr" Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.499036 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:10 crc kubenswrapper[4764]: E0309 13:38:10.499160 4764 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.499176 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:10 crc kubenswrapper[4764]: E0309 13:38:10.499216 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs podName:e11f44d8-58a5-4fc7-b05b-e2e688647d01 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:11.499200679 +0000 UTC m=+1046.749372587 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs") pod "openstack-operator-controller-manager-6746d697b-lr6nx" (UID: "e11f44d8-58a5-4fc7-b05b-e2e688647d01") : secret "webhook-server-cert" not found Mar 09 13:38:10 crc kubenswrapper[4764]: E0309 13:38:10.499258 4764 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 09 13:38:10 crc kubenswrapper[4764]: E0309 13:38:10.499288 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs podName:e11f44d8-58a5-4fc7-b05b-e2e688647d01 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:11.499278651 +0000 UTC m=+1046.749450559 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs") pod "openstack-operator-controller-manager-6746d697b-lr6nx" (UID: "e11f44d8-58a5-4fc7-b05b-e2e688647d01") : secret "metrics-server-cert" not found Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.539356 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-hvpbz"] Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.558060 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5xc2s"] Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.572978 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-cmtpc"] Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.585441 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-mjf6m"] Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.597098 
4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-jnmbv"] Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.660757 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c7bcbc569-qhpvs"] Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.667327 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-4cpsz"] Mar 09 13:38:10 crc kubenswrapper[4764]: W0309 13:38:10.690448 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32eb5815_c566_4177_8b47_f756807d4a30.slice/crio-e1f885f85ae4d97874d821cbea9fa8a3ef4927592a3615d22ee85fdec3b4a333 WatchSource:0}: Error finding container e1f885f85ae4d97874d821cbea9fa8a3ef4927592a3615d22ee85fdec3b4a333: Status 404 returned error can't find the container with id e1f885f85ae4d97874d821cbea9fa8a3ef4927592a3615d22ee85fdec3b4a333 Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.691483 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-wv2rp"] Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.704994 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert\") pod \"infra-operator-controller-manager-5995f4446f-m58s9\" (UID: \"bfda7896-83e3-407c-9eb5-74fbc11104f0\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9" Mar 09 13:38:10 crc kubenswrapper[4764]: E0309 13:38:10.705155 4764 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 09 13:38:10 crc kubenswrapper[4764]: E0309 13:38:10.705248 4764 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert podName:bfda7896-83e3-407c-9eb5-74fbc11104f0 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:12.705223502 +0000 UTC m=+1047.955395410 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert") pod "infra-operator-controller-manager-5995f4446f-m58s9" (UID: "bfda7896-83e3-407c-9eb5-74fbc11104f0") : secret "infra-operator-webhook-server-cert" not found Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.806788 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-dm7rn"] Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.819011 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-jfgzw"] Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.820019 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-nppjq" event={"ID":"4c271ca0-0c25-46d1-b730-e94f68397e29","Type":"ContainerStarted","Data":"453cd0556a5a9d4e58f08c125aa16fcec22f7fbc83cd351c18f798adf3681e01"} Mar 09 13:38:10 crc kubenswrapper[4764]: W0309 13:38:10.821088 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26535a82_8d70_4623_b2b4_7dd1546d48d6.slice/crio-532fdf9d88bb4997585ea9f96d3af86c9cc2f6e163a943806143b437035a35e0 WatchSource:0}: Error finding container 532fdf9d88bb4997585ea9f96d3af86c9cc2f6e163a943806143b437035a35e0: Status 404 returned error can't find the container with id 532fdf9d88bb4997585ea9f96d3af86c9cc2f6e163a943806143b437035a35e0 Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.824861 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/manila-operator-controller-manager-7c7bcbc569-qhpvs" event={"ID":"5cd7eb92-2fae-4978-a5e9-58fa87c63e84","Type":"ContainerStarted","Data":"2f2e45ecb699d7bde9da9d5f11e44d4e5a917a68f0bd3a50c818220f5192f006"} Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.828985 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hvpbz" event={"ID":"5bd11a3c-79f3-4aa2-9d5a-78ec3c18cb31","Type":"ContainerStarted","Data":"21e4357340aa7356c17959b990234c44d54f135747e955392796bf9e53c7ba5d"} Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.835930 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-cgv66"] Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.839131 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-4cpsz" event={"ID":"c6a51c61-770b-4b46-9dd3-1f61e6f5ebc8","Type":"ContainerStarted","Data":"956bd835dec5019d146939cac22529a5e7bb52fdc032f62c2fd46668000f4d84"} Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.863487 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-vkns5"] Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.863550 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wv2rp" event={"ID":"32eb5815-c566-4177-8b47-f756807d4a30","Type":"ContainerStarted","Data":"e1f885f85ae4d97874d821cbea9fa8a3ef4927592a3615d22ee85fdec3b4a333"} Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.864634 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-cmtpc" 
event={"ID":"725c0dd0-07d1-4a1c-b223-e8bec76cc7ff","Type":"ContainerStarted","Data":"e8a90e91a245e03595ceb2876d19da11011a1c5798d0035edbfd9c252f1c077d"} Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.865684 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-82cg8" event={"ID":"e220a3f1-4dbe-4ee6-9b19-26985fa998cf","Type":"ContainerStarted","Data":"436d33e370c3500040ecc542247302bce24a8a00ee337b46e2f406a44bdaa218"} Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.866619 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5xc2s" event={"ID":"3da43711-be34-4189-b686-e8e9bc9e7265","Type":"ContainerStarted","Data":"782aacb8ac9540eac4995bb178b4be34655e33dd40680d4d82147b011c3bd1be"} Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.867538 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mjf6m" event={"ID":"488ff419-d889-4778-96cf-a11006c49507","Type":"ContainerStarted","Data":"54b9226ff3f8aaf10350199499dd0a3615c3b73553afd9aa930ad22f1cd57f04"} Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.868618 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-jnmbv" event={"ID":"7295db10-1c36-4c17-bf1e-4c4a702c201b","Type":"ContainerStarted","Data":"ec1981c22097f197cc5c9636d6e6fa3817921dadae9f13097f2fc7d2b9bf02e8"} Mar 09 13:38:10 crc kubenswrapper[4764]: W0309 13:38:10.873551 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda851ddd_2b27_45f0_b149_de32ae21ad91.slice/crio-f856db644f5da9169430c0c5fab980932b910b982e9db6e7990a9c6003480f6d WatchSource:0}: Error finding container f856db644f5da9169430c0c5fab980932b910b982e9db6e7990a9c6003480f6d: Status 404 returned error can't find the 
container with id f856db644f5da9169430c0c5fab980932b910b982e9db6e7990a9c6003480f6d Mar 09 13:38:10 crc kubenswrapper[4764]: E0309 13:38:10.877512 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:5592ec4a6fbe2c832d1828b51af0b907e5d733d478b6f378a9b2f6d6cf0ac505,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cvk8m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-7b6bfb6475-vkns5_openstack-operators(da851ddd-2b27-45f0-b149-de32ae21ad91): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 09 13:38:10 crc kubenswrapper[4764]: E0309 13:38:10.879148 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-vkns5" podUID="da851ddd-2b27-45f0-b149-de32ae21ad91" Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.996822 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-d65xp"] Mar 09 13:38:11 crc kubenswrapper[4764]: I0309 13:38:11.007690 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bf8w8"] Mar 09 13:38:11 crc kubenswrapper[4764]: I0309 13:38:11.021662 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-8ms5w"] Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.032473 4764 kuberuntime_manager.go:1274] 
"Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:f309cdea8084a4b1e8cbcd732d6e250fd93c55cfd1b48ba9026907c8591faab7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rhthv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-9b9ff9f4d-bf8w8_openstack-operators(003210d3-5572-44bd-aae5-d5e24aac16a5): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 09 13:38:11 crc kubenswrapper[4764]: I0309 13:38:11.032761 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-gv2sm"] Mar 09 13:38:11 crc kubenswrapper[4764]: W0309 13:38:11.032937 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc44e76b2_0de9_4a5b_93ee_536c6300157f.slice/crio-efbc7e66c2b8f2bdc691a5de2e13d156a4f5d3ccea628eea1e12b9e80cd0713e WatchSource:0}: Error finding container efbc7e66c2b8f2bdc691a5de2e13d156a4f5d3ccea628eea1e12b9e80cd0713e: Status 404 returned error can't find the container with id efbc7e66c2b8f2bdc691a5de2e13d156a4f5d3ccea628eea1e12b9e80cd0713e Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.033666 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" 
pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bf8w8" podUID="003210d3-5572-44bd-aae5-d5e24aac16a5" Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.035611 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fhwj4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-648564c9fc-8ms5w_openstack-operators(c44e76b2-0de9-4a5b-93ee-536c6300157f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.036718 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-8ms5w" podUID="c44e76b2-0de9-4a5b-93ee-536c6300157f" Mar 09 13:38:11 crc kubenswrapper[4764]: I0309 13:38:11.042928 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6v2sq"] Mar 09 13:38:11 crc kubenswrapper[4764]: W0309 13:38:11.043208 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01ea99aa_eb21_4799_9557_42c3fb55945a.slice/crio-9bd570e87d64f9de5b72579e1811c0c122d2847d8eca056f67cefdcf7b8d3d6f WatchSource:0}: Error finding container 9bd570e87d64f9de5b72579e1811c0c122d2847d8eca056f67cefdcf7b8d3d6f: Status 404 returned error can't find the container with id 
9bd570e87d64f9de5b72579e1811c0c122d2847d8eca056f67cefdcf7b8d3d6f Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.045109 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kb6g7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-6v2sq_openstack-operators(01ea99aa-eb21-4799-9557-42c3fb55945a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.047142 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6v2sq" podUID="01ea99aa-eb21-4799-9557-42c3fb55945a" Mar 09 13:38:11 crc kubenswrapper[4764]: I0309 13:38:11.050200 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-7f8nr"] Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.054591 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ms9zj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5d86c7ddb7-gv2sm_openstack-operators(b54e2237-603a-44ad-a129-04736cf749b2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.055943 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-gv2sm" podUID="b54e2237-603a-44ad-a129-04736cf749b2" Mar 09 13:38:11 crc kubenswrapper[4764]: W0309 13:38:11.061050 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf705ec78_e960_4200_b5a6_f3d4310f1bd5.slice/crio-14622df55451775e15d1e15f57b322fc2931130167f7083e5b3dc7fefa133ca3 WatchSource:0}: Error finding container 14622df55451775e15d1e15f57b322fc2931130167f7083e5b3dc7fefa133ca3: Status 404 returned error can't find the container with id 14622df55451775e15d1e15f57b322fc2931130167f7083e5b3dc7fefa133ca3 Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.068923 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h5nrw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-bccc79885-7f8nr_openstack-operators(f705ec78-e960-4200-b5a6-f3d4310f1bd5): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.070073 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7f8nr" podUID="f705ec78-e960-4200-b5a6-f3d4310f1bd5" Mar 09 13:38:11 crc kubenswrapper[4764]: I0309 13:38:11.213191 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm\" (UID: \"47bd7072-a414-4ce8-800b-753b7054be23\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.213455 4764 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found 
Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.213530 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert podName:47bd7072-a414-4ce8-800b-753b7054be23 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:13.213509376 +0000 UTC m=+1048.463681284 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" (UID: "47bd7072-a414-4ce8-800b-753b7054be23") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 13:38:11 crc kubenswrapper[4764]: I0309 13:38:11.517388 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:11 crc kubenswrapper[4764]: I0309 13:38:11.517514 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.517710 4764 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.517775 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs podName:e11f44d8-58a5-4fc7-b05b-e2e688647d01 
nodeName:}" failed. No retries permitted until 2026-03-09 13:38:13.51775644 +0000 UTC m=+1048.767928348 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs") pod "openstack-operator-controller-manager-6746d697b-lr6nx" (UID: "e11f44d8-58a5-4fc7-b05b-e2e688647d01") : secret "webhook-server-cert" not found Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.518200 4764 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.518234 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs podName:e11f44d8-58a5-4fc7-b05b-e2e688647d01 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:13.518221903 +0000 UTC m=+1048.768393811 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs") pod "openstack-operator-controller-manager-6746d697b-lr6nx" (UID: "e11f44d8-58a5-4fc7-b05b-e2e688647d01") : secret "metrics-server-cert" not found Mar 09 13:38:11 crc kubenswrapper[4764]: I0309 13:38:11.889291 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-d65xp" event={"ID":"867908a2-f085-4f3d-b569-84c915f730b1","Type":"ContainerStarted","Data":"42c5e3cad1bdf17243a5ef9fb0a15003c1b108da10c6678ed364380d3c78156d"} Mar 09 13:38:11 crc kubenswrapper[4764]: I0309 13:38:11.891519 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-8ms5w" event={"ID":"c44e76b2-0de9-4a5b-93ee-536c6300157f","Type":"ContainerStarted","Data":"efbc7e66c2b8f2bdc691a5de2e13d156a4f5d3ccea628eea1e12b9e80cd0713e"} Mar 09 13:38:11 crc 
kubenswrapper[4764]: I0309 13:38:11.893811 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-jfgzw" event={"ID":"615473d3-072e-4685-8f32-73a44badf1e2","Type":"ContainerStarted","Data":"2c871aa692953c9072929952c5a03cf3e3ee9fe2af4f1e7c6c3ad701482442d9"} Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.893905 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e\\\"\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-8ms5w" podUID="c44e76b2-0de9-4a5b-93ee-536c6300157f" Mar 09 13:38:11 crc kubenswrapper[4764]: I0309 13:38:11.897231 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dm7rn" event={"ID":"26535a82-8d70-4623-b2b4-7dd1546d48d6","Type":"ContainerStarted","Data":"532fdf9d88bb4997585ea9f96d3af86c9cc2f6e163a943806143b437035a35e0"} Mar 09 13:38:11 crc kubenswrapper[4764]: I0309 13:38:11.900977 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6v2sq" event={"ID":"01ea99aa-eb21-4799-9557-42c3fb55945a","Type":"ContainerStarted","Data":"9bd570e87d64f9de5b72579e1811c0c122d2847d8eca056f67cefdcf7b8d3d6f"} Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.903543 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6v2sq" podUID="01ea99aa-eb21-4799-9557-42c3fb55945a" Mar 09 13:38:11 crc 
kubenswrapper[4764]: I0309 13:38:11.903609 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-gv2sm" event={"ID":"b54e2237-603a-44ad-a129-04736cf749b2","Type":"ContainerStarted","Data":"d354f5f451b0d3f76f0897ebe74945c06349882a6ce0bb157af5c3891bd652dd"} Mar 09 13:38:11 crc kubenswrapper[4764]: I0309 13:38:11.907833 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-vkns5" event={"ID":"da851ddd-2b27-45f0-b149-de32ae21ad91","Type":"ContainerStarted","Data":"f856db644f5da9169430c0c5fab980932b910b982e9db6e7990a9c6003480f6d"} Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.908047 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-gv2sm" podUID="b54e2237-603a-44ad-a129-04736cf749b2" Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.910772 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:5592ec4a6fbe2c832d1828b51af0b907e5d733d478b6f378a9b2f6d6cf0ac505\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-vkns5" podUID="da851ddd-2b27-45f0-b149-de32ae21ad91" Mar 09 13:38:11 crc kubenswrapper[4764]: I0309 13:38:11.914678 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-cgv66" event={"ID":"2ddf1e89-9c89-4052-aa1b-6fb84438b86d","Type":"ContainerStarted","Data":"531be4af8b9aa765e5a8e039032a605d6f34837c3864b85c4da7c3cd37d61378"} Mar 09 13:38:11 crc 
kubenswrapper[4764]: I0309 13:38:11.924509 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bf8w8" event={"ID":"003210d3-5572-44bd-aae5-d5e24aac16a5","Type":"ContainerStarted","Data":"463fcdec7f496474c27aba53d573e8b949915bf1feb7cf322e622bc5b56ad357"} Mar 09 13:38:11 crc kubenswrapper[4764]: I0309 13:38:11.932050 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7f8nr" event={"ID":"f705ec78-e960-4200-b5a6-f3d4310f1bd5","Type":"ContainerStarted","Data":"14622df55451775e15d1e15f57b322fc2931130167f7083e5b3dc7fefa133ca3"} Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.933569 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:f309cdea8084a4b1e8cbcd732d6e250fd93c55cfd1b48ba9026907c8591faab7\\\"\"" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bf8w8" podUID="003210d3-5572-44bd-aae5-d5e24aac16a5" Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.934130 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7f8nr" podUID="f705ec78-e960-4200-b5a6-f3d4310f1bd5" Mar 09 13:38:12 crc kubenswrapper[4764]: I0309 13:38:12.740464 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert\") pod \"infra-operator-controller-manager-5995f4446f-m58s9\" (UID: \"bfda7896-83e3-407c-9eb5-74fbc11104f0\") " 
pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9" Mar 09 13:38:12 crc kubenswrapper[4764]: E0309 13:38:12.742153 4764 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 09 13:38:12 crc kubenswrapper[4764]: E0309 13:38:12.742232 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert podName:bfda7896-83e3-407c-9eb5-74fbc11104f0 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:16.742207883 +0000 UTC m=+1051.992379791 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert") pod "infra-operator-controller-manager-5995f4446f-m58s9" (UID: "bfda7896-83e3-407c-9eb5-74fbc11104f0") : secret "infra-operator-webhook-server-cert" not found Mar 09 13:38:12 crc kubenswrapper[4764]: E0309 13:38:12.960451 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7f8nr" podUID="f705ec78-e960-4200-b5a6-f3d4310f1bd5" Mar 09 13:38:12 crc kubenswrapper[4764]: E0309 13:38:12.960587 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6v2sq" podUID="01ea99aa-eb21-4799-9557-42c3fb55945a" Mar 09 13:38:12 crc kubenswrapper[4764]: E0309 13:38:12.960683 4764 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-gv2sm" podUID="b54e2237-603a-44ad-a129-04736cf749b2" Mar 09 13:38:12 crc kubenswrapper[4764]: E0309 13:38:12.960767 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e\\\"\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-8ms5w" podUID="c44e76b2-0de9-4a5b-93ee-536c6300157f" Mar 09 13:38:12 crc kubenswrapper[4764]: E0309 13:38:12.960880 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:5592ec4a6fbe2c832d1828b51af0b907e5d733d478b6f378a9b2f6d6cf0ac505\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-vkns5" podUID="da851ddd-2b27-45f0-b149-de32ae21ad91" Mar 09 13:38:12 crc kubenswrapper[4764]: E0309 13:38:12.960935 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:f309cdea8084a4b1e8cbcd732d6e250fd93c55cfd1b48ba9026907c8591faab7\\\"\"" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bf8w8" podUID="003210d3-5572-44bd-aae5-d5e24aac16a5" Mar 09 13:38:13 crc kubenswrapper[4764]: I0309 13:38:13.252254 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm\" (UID: \"47bd7072-a414-4ce8-800b-753b7054be23\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" Mar 09 13:38:13 crc kubenswrapper[4764]: E0309 13:38:13.252776 4764 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 13:38:13 crc kubenswrapper[4764]: E0309 13:38:13.252886 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert podName:47bd7072-a414-4ce8-800b-753b7054be23 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:17.2528613 +0000 UTC m=+1052.503033208 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" (UID: "47bd7072-a414-4ce8-800b-753b7054be23") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 13:38:13 crc kubenswrapper[4764]: I0309 13:38:13.557441 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:13 crc kubenswrapper[4764]: I0309 13:38:13.557576 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: 
\"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:13 crc kubenswrapper[4764]: E0309 13:38:13.557787 4764 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 09 13:38:13 crc kubenswrapper[4764]: E0309 13:38:13.557855 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs podName:e11f44d8-58a5-4fc7-b05b-e2e688647d01 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:17.557835143 +0000 UTC m=+1052.808007061 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs") pod "openstack-operator-controller-manager-6746d697b-lr6nx" (UID: "e11f44d8-58a5-4fc7-b05b-e2e688647d01") : secret "metrics-server-cert" not found Mar 09 13:38:13 crc kubenswrapper[4764]: E0309 13:38:13.557989 4764 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 09 13:38:13 crc kubenswrapper[4764]: E0309 13:38:13.558109 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs podName:e11f44d8-58a5-4fc7-b05b-e2e688647d01 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:17.55808276 +0000 UTC m=+1052.808254668 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs") pod "openstack-operator-controller-manager-6746d697b-lr6nx" (UID: "e11f44d8-58a5-4fc7-b05b-e2e688647d01") : secret "webhook-server-cert" not found Mar 09 13:38:16 crc kubenswrapper[4764]: I0309 13:38:16.824433 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert\") pod \"infra-operator-controller-manager-5995f4446f-m58s9\" (UID: \"bfda7896-83e3-407c-9eb5-74fbc11104f0\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9" Mar 09 13:38:16 crc kubenswrapper[4764]: E0309 13:38:16.824619 4764 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 09 13:38:16 crc kubenswrapper[4764]: E0309 13:38:16.824719 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert podName:bfda7896-83e3-407c-9eb5-74fbc11104f0 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:24.824696588 +0000 UTC m=+1060.074868496 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert") pod "infra-operator-controller-manager-5995f4446f-m58s9" (UID: "bfda7896-83e3-407c-9eb5-74fbc11104f0") : secret "infra-operator-webhook-server-cert" not found Mar 09 13:38:17 crc kubenswrapper[4764]: I0309 13:38:17.334073 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm\" (UID: \"47bd7072-a414-4ce8-800b-753b7054be23\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" Mar 09 13:38:17 crc kubenswrapper[4764]: E0309 13:38:17.334317 4764 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 13:38:17 crc kubenswrapper[4764]: E0309 13:38:17.334388 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert podName:47bd7072-a414-4ce8-800b-753b7054be23 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:25.334371289 +0000 UTC m=+1060.584543187 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" (UID: "47bd7072-a414-4ce8-800b-753b7054be23") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 13:38:17 crc kubenswrapper[4764]: I0309 13:38:17.639186 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:17 crc kubenswrapper[4764]: I0309 13:38:17.639274 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:17 crc kubenswrapper[4764]: E0309 13:38:17.639364 4764 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 09 13:38:17 crc kubenswrapper[4764]: E0309 13:38:17.639388 4764 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 09 13:38:17 crc kubenswrapper[4764]: E0309 13:38:17.639423 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs podName:e11f44d8-58a5-4fc7-b05b-e2e688647d01 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:25.639409694 +0000 UTC m=+1060.889581602 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs") pod "openstack-operator-controller-manager-6746d697b-lr6nx" (UID: "e11f44d8-58a5-4fc7-b05b-e2e688647d01") : secret "webhook-server-cert" not found Mar 09 13:38:17 crc kubenswrapper[4764]: E0309 13:38:17.639438 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs podName:e11f44d8-58a5-4fc7-b05b-e2e688647d01 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:25.639432475 +0000 UTC m=+1060.889604383 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs") pod "openstack-operator-controller-manager-6746d697b-lr6nx" (UID: "e11f44d8-58a5-4fc7-b05b-e2e688647d01") : secret "metrics-server-cert" not found Mar 09 13:38:22 crc kubenswrapper[4764]: E0309 13:38:22.687798 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:3f9b0446a124745439306dc3bb7faec8c02c0b6be33f788b9d455fa57fb60120" Mar 09 13:38:22 crc kubenswrapper[4764]: E0309 13:38:22.688596 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:3f9b0446a124745439306dc3bb7faec8c02c0b6be33f788b9d455fa57fb60120,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-znvfv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-6db6876945-82cg8_openstack-operators(e220a3f1-4dbe-4ee6-9b19-26985fa998cf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:38:22 crc kubenswrapper[4764]: E0309 13:38:22.689772 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-82cg8" podUID="e220a3f1-4dbe-4ee6-9b19-26985fa998cf" Mar 09 13:38:23 crc kubenswrapper[4764]: E0309 13:38:23.052920 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:3f9b0446a124745439306dc3bb7faec8c02c0b6be33f788b9d455fa57fb60120\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-82cg8" podUID="e220a3f1-4dbe-4ee6-9b19-26985fa998cf" Mar 09 13:38:23 crc kubenswrapper[4764]: E0309 13:38:23.351871 4764 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:7961c67cfc87de69055f8330771af625f73d857426c4bb17ebb888ead843fff3" Mar 09 13:38:23 crc kubenswrapper[4764]: E0309 13:38:23.352187 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:7961c67cfc87de69055f8330771af625f73d857426c4bb17ebb888ead843fff3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lnz4t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-55d77d7b5c-nppjq_openstack-operators(4c271ca0-0c25-46d1-b730-e94f68397e29): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:38:23 crc kubenswrapper[4764]: E0309 13:38:23.353354 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-nppjq" podUID="4c271ca0-0c25-46d1-b730-e94f68397e29" Mar 09 13:38:24 crc kubenswrapper[4764]: E0309 13:38:24.010688 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:81e43c058d9af1d3bc31704010c630bc2a574c2ee388aa0ffe8c7b9621a7d051" Mar 09 13:38:24 crc kubenswrapper[4764]: E0309 13:38:24.012334 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:81e43c058d9af1d3bc31704010c630bc2a574c2ee388aa0ffe8c7b9621a7d051,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-95kb4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-64db6967f8-mjf6m_openstack-operators(488ff419-d889-4778-96cf-a11006c49507): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:38:24 crc kubenswrapper[4764]: E0309 13:38:24.013600 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mjf6m" podUID="488ff419-d889-4778-96cf-a11006c49507" Mar 09 13:38:24 crc kubenswrapper[4764]: E0309 13:38:24.064231 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:7961c67cfc87de69055f8330771af625f73d857426c4bb17ebb888ead843fff3\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-nppjq" podUID="4c271ca0-0c25-46d1-b730-e94f68397e29" Mar 09 13:38:24 crc kubenswrapper[4764]: E0309 13:38:24.064601 4764 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:81e43c058d9af1d3bc31704010c630bc2a574c2ee388aa0ffe8c7b9621a7d051\\\"\"" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mjf6m" podUID="488ff419-d889-4778-96cf-a11006c49507" Mar 09 13:38:24 crc kubenswrapper[4764]: E0309 13:38:24.504800 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c" Mar 09 13:38:24 crc kubenswrapper[4764]: E0309 13:38:24.505025 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-flbm2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7c789f89c6-wv2rp_openstack-operators(32eb5815-c566-4177-8b47-f756807d4a30): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:38:24 crc kubenswrapper[4764]: E0309 13:38:24.506106 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wv2rp" podUID="32eb5815-c566-4177-8b47-f756807d4a30" Mar 09 13:38:24 crc kubenswrapper[4764]: I0309 13:38:24.849535 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert\") pod 
\"infra-operator-controller-manager-5995f4446f-m58s9\" (UID: \"bfda7896-83e3-407c-9eb5-74fbc11104f0\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9" Mar 09 13:38:24 crc kubenswrapper[4764]: I0309 13:38:24.863337 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert\") pod \"infra-operator-controller-manager-5995f4446f-m58s9\" (UID: \"bfda7896-83e3-407c-9eb5-74fbc11104f0\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9" Mar 09 13:38:24 crc kubenswrapper[4764]: I0309 13:38:24.917297 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.070224 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-jfgzw" event={"ID":"615473d3-072e-4685-8f32-73a44badf1e2","Type":"ContainerStarted","Data":"ae438d53b49a18ccfc8b19d558db62b8a25bf552861a59f0bb99871f82729c57"} Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.070357 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-jfgzw" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.072304 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-cgv66" event={"ID":"2ddf1e89-9c89-4052-aa1b-6fb84438b86d","Type":"ContainerStarted","Data":"045d880516f76471069e4310ea270473fbf523a5785fd893e4d36f9a22913f9e"} Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.072408 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-54688575f-cgv66" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.073693 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c7bcbc569-qhpvs" event={"ID":"5cd7eb92-2fae-4978-a5e9-58fa87c63e84","Type":"ContainerStarted","Data":"0e9828b895f4ce1da3226383c620e966fe893adf71a5c073fa63a8068e4716da"} Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.073809 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c7bcbc569-qhpvs" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.074990 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dm7rn" event={"ID":"26535a82-8d70-4623-b2b4-7dd1546d48d6","Type":"ContainerStarted","Data":"75ac040f402881ef08927b332e8dadd06a3b59ceb485dd8f6e92a2c3c453a3ee"} Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.075077 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dm7rn" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.080832 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-d65xp" event={"ID":"867908a2-f085-4f3d-b569-84c915f730b1","Type":"ContainerStarted","Data":"e6be54291177274fb63d942fce6918f669c313a3bc8ed8821137142306cf6877"} Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.080934 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-d65xp" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.105370 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-jfgzw" podStartSLOduration=2.467226953 podStartE2EDuration="16.105349484s" podCreationTimestamp="2026-03-09 13:38:09 +0000 UTC" firstStartedPulling="2026-03-09 13:38:10.838683192 +0000 UTC 
m=+1046.088855100" lastFinishedPulling="2026-03-09 13:38:24.476805723 +0000 UTC m=+1059.726977631" observedRunningTime="2026-03-09 13:38:25.102982832 +0000 UTC m=+1060.353154740" watchObservedRunningTime="2026-03-09 13:38:25.105349484 +0000 UTC m=+1060.355521392" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.130564 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-cmtpc" event={"ID":"725c0dd0-07d1-4a1c-b223-e8bec76cc7ff","Type":"ContainerStarted","Data":"5269a9d24de0b25114963ad28ea8ef517ecd3033e32d02bf1974160e408e95f4"} Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.131465 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-cmtpc" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.143068 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c7bcbc569-qhpvs" podStartSLOduration=3.329941973 podStartE2EDuration="17.143051867s" podCreationTimestamp="2026-03-09 13:38:08 +0000 UTC" firstStartedPulling="2026-03-09 13:38:10.663590806 +0000 UTC m=+1045.913762714" lastFinishedPulling="2026-03-09 13:38:24.4767007 +0000 UTC m=+1059.726872608" observedRunningTime="2026-03-09 13:38:25.142273387 +0000 UTC m=+1060.392445305" watchObservedRunningTime="2026-03-09 13:38:25.143051867 +0000 UTC m=+1060.393223785" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.166097 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-jnmbv" event={"ID":"7295db10-1c36-4c17-bf1e-4c4a702c201b","Type":"ContainerStarted","Data":"4bc1ca006daeaf3def75bdee29704e431cff99826caa1f181218c14aa36c57d8"} Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.166794 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/heat-operator-controller-manager-cf99c678f-jnmbv" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.189611 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dm7rn" podStartSLOduration=3.551551793 podStartE2EDuration="17.189588241s" podCreationTimestamp="2026-03-09 13:38:08 +0000 UTC" firstStartedPulling="2026-03-09 13:38:10.838664992 +0000 UTC m=+1046.088836900" lastFinishedPulling="2026-03-09 13:38:24.47670144 +0000 UTC m=+1059.726873348" observedRunningTime="2026-03-09 13:38:25.181532171 +0000 UTC m=+1060.431704079" watchObservedRunningTime="2026-03-09 13:38:25.189588241 +0000 UTC m=+1060.439760149" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.194389 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hvpbz" event={"ID":"5bd11a3c-79f3-4aa2-9d5a-78ec3c18cb31","Type":"ContainerStarted","Data":"36ae9cc3eb8d582908ec232b9caf649350d5c1ef549bff11c310a5484a27090e"} Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.195256 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hvpbz" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.234008 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-4cpsz" event={"ID":"c6a51c61-770b-4b46-9dd3-1f61e6f5ebc8","Type":"ContainerStarted","Data":"081b4d02546b89537610829d297210a1a219e9ed2e9ccc5e12eda310701d7739"} Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.234759 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-4cpsz" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.265081 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5xc2s" event={"ID":"3da43711-be34-4189-b686-e8e9bc9e7265","Type":"ContainerStarted","Data":"324c5be969cd92e40a6215cf9da106eb9e0ec3b444e208508566e41cfc6c75ee"} Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.265437 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5xc2s" Mar 09 13:38:25 crc kubenswrapper[4764]: E0309 13:38:25.273934 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wv2rp" podUID="32eb5815-c566-4177-8b47-f756807d4a30" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.311485 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-54688575f-cgv66" podStartSLOduration=3.695826775 podStartE2EDuration="17.311462789s" podCreationTimestamp="2026-03-09 13:38:08 +0000 UTC" firstStartedPulling="2026-03-09 13:38:10.862861663 +0000 UTC m=+1046.113033581" lastFinishedPulling="2026-03-09 13:38:24.478497677 +0000 UTC m=+1059.728669595" observedRunningTime="2026-03-09 13:38:25.303091711 +0000 UTC m=+1060.553263629" watchObservedRunningTime="2026-03-09 13:38:25.311462789 +0000 UTC m=+1060.561634717" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.328299 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-d65xp" podStartSLOduration=2.820098716 podStartE2EDuration="16.314248262s" podCreationTimestamp="2026-03-09 13:38:09 +0000 UTC" firstStartedPulling="2026-03-09 13:38:11.023204834 +0000 UTC m=+1046.273376732" 
lastFinishedPulling="2026-03-09 13:38:24.51735437 +0000 UTC m=+1059.767526278" observedRunningTime="2026-03-09 13:38:25.227996783 +0000 UTC m=+1060.478168691" watchObservedRunningTime="2026-03-09 13:38:25.314248262 +0000 UTC m=+1060.564420170" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.371890 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm\" (UID: \"47bd7072-a414-4ce8-800b-753b7054be23\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" Mar 09 13:38:25 crc kubenswrapper[4764]: E0309 13:38:25.372036 4764 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 13:38:25 crc kubenswrapper[4764]: E0309 13:38:25.372084 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert podName:47bd7072-a414-4ce8-800b-753b7054be23 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:41.37206978 +0000 UTC m=+1076.622241688 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" (UID: "47bd7072-a414-4ce8-800b-753b7054be23") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.517684 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hvpbz" podStartSLOduration=3.590751344 podStartE2EDuration="17.517666316s" podCreationTimestamp="2026-03-09 13:38:08 +0000 UTC" firstStartedPulling="2026-03-09 13:38:10.578092346 +0000 UTC m=+1045.828264254" lastFinishedPulling="2026-03-09 13:38:24.505007318 +0000 UTC m=+1059.755179226" observedRunningTime="2026-03-09 13:38:25.44114258 +0000 UTC m=+1060.691314498" watchObservedRunningTime="2026-03-09 13:38:25.517666316 +0000 UTC m=+1060.767838224" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.590523 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5xc2s" podStartSLOduration=3.676560931 podStartE2EDuration="17.590498945s" podCreationTimestamp="2026-03-09 13:38:08 +0000 UTC" firstStartedPulling="2026-03-09 13:38:10.586189817 +0000 UTC m=+1045.836361725" lastFinishedPulling="2026-03-09 13:38:24.500127831 +0000 UTC m=+1059.750299739" observedRunningTime="2026-03-09 13:38:25.55501544 +0000 UTC m=+1060.805187358" watchObservedRunningTime="2026-03-09 13:38:25.590498945 +0000 UTC m=+1060.840670853" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.604688 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-4cpsz" podStartSLOduration=2.777043832 podStartE2EDuration="16.604662835s" podCreationTimestamp="2026-03-09 13:38:09 +0000 UTC" 
firstStartedPulling="2026-03-09 13:38:10.67718224 +0000 UTC m=+1045.927354148" lastFinishedPulling="2026-03-09 13:38:24.504801243 +0000 UTC m=+1059.754973151" observedRunningTime="2026-03-09 13:38:25.58071015 +0000 UTC m=+1060.830882058" watchObservedRunningTime="2026-03-09 13:38:25.604662835 +0000 UTC m=+1060.854834743" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.662602 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-jnmbv" podStartSLOduration=3.781753806 podStartE2EDuration="17.662576795s" podCreationTimestamp="2026-03-09 13:38:08 +0000 UTC" firstStartedPulling="2026-03-09 13:38:10.619062145 +0000 UTC m=+1045.869234053" lastFinishedPulling="2026-03-09 13:38:24.499885134 +0000 UTC m=+1059.750057042" observedRunningTime="2026-03-09 13:38:25.656139917 +0000 UTC m=+1060.906311835" watchObservedRunningTime="2026-03-09 13:38:25.662576795 +0000 UTC m=+1060.912748713" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.673631 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9"] Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.680964 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.681053 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " 
pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:25 crc kubenswrapper[4764]: E0309 13:38:25.681184 4764 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 09 13:38:25 crc kubenswrapper[4764]: E0309 13:38:25.681234 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs podName:e11f44d8-58a5-4fc7-b05b-e2e688647d01 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:41.681219901 +0000 UTC m=+1076.931391799 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs") pod "openstack-operator-controller-manager-6746d697b-lr6nx" (UID: "e11f44d8-58a5-4fc7-b05b-e2e688647d01") : secret "metrics-server-cert" not found Mar 09 13:38:25 crc kubenswrapper[4764]: E0309 13:38:25.682137 4764 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 09 13:38:25 crc kubenswrapper[4764]: E0309 13:38:25.682262 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs podName:e11f44d8-58a5-4fc7-b05b-e2e688647d01 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:41.682225637 +0000 UTC m=+1076.932397545 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs") pod "openstack-operator-controller-manager-6746d697b-lr6nx" (UID: "e11f44d8-58a5-4fc7-b05b-e2e688647d01") : secret "webhook-server-cert" not found Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.699130 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-cmtpc" podStartSLOduration=3.803510523 podStartE2EDuration="17.699106748s" podCreationTimestamp="2026-03-09 13:38:08 +0000 UTC" firstStartedPulling="2026-03-09 13:38:10.603638062 +0000 UTC m=+1045.853809960" lastFinishedPulling="2026-03-09 13:38:24.499234277 +0000 UTC m=+1059.749406185" observedRunningTime="2026-03-09 13:38:25.673231543 +0000 UTC m=+1060.923403461" watchObservedRunningTime="2026-03-09 13:38:25.699106748 +0000 UTC m=+1060.949278666" Mar 09 13:38:26 crc kubenswrapper[4764]: I0309 13:38:26.274355 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9" event={"ID":"bfda7896-83e3-407c-9eb5-74fbc11104f0","Type":"ContainerStarted","Data":"95e2fad6142dad7b1e60999f1058d9a3d85f897812628e3cf85b60a66627ed9f"} Mar 09 13:38:29 crc kubenswrapper[4764]: I0309 13:38:29.042167 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-cmtpc" Mar 09 13:38:29 crc kubenswrapper[4764]: I0309 13:38:29.174476 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-jnmbv" Mar 09 13:38:29 crc kubenswrapper[4764]: I0309 13:38:29.209406 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5xc2s" Mar 09 13:38:29 crc kubenswrapper[4764]: I0309 
13:38:29.272313 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hvpbz" Mar 09 13:38:29 crc kubenswrapper[4764]: I0309 13:38:29.419762 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c7bcbc569-qhpvs" Mar 09 13:38:29 crc kubenswrapper[4764]: I0309 13:38:29.661725 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-54688575f-cgv66" Mar 09 13:38:29 crc kubenswrapper[4764]: I0309 13:38:29.662880 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dm7rn" Mar 09 13:38:29 crc kubenswrapper[4764]: I0309 13:38:29.784405 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-4cpsz" Mar 09 13:38:29 crc kubenswrapper[4764]: I0309 13:38:29.883920 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-d65xp" Mar 09 13:38:29 crc kubenswrapper[4764]: I0309 13:38:29.908734 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-jfgzw" Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.381271 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9" event={"ID":"bfda7896-83e3-407c-9eb5-74fbc11104f0","Type":"ContainerStarted","Data":"974135c1a92e45c27a888b5eaf4c65975b81ae4d8749dd46f10b5bbf6ed4fa9b"} Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.382091 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9" Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.383135 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-vkns5" event={"ID":"da851ddd-2b27-45f0-b149-de32ae21ad91","Type":"ContainerStarted","Data":"7b0e39b49694b7eca18a30e1ed0d1cc40b8b2bde68d1b29cec6e28254f36d633"} Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.383321 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-vkns5" Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.385216 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bf8w8" event={"ID":"003210d3-5572-44bd-aae5-d5e24aac16a5","Type":"ContainerStarted","Data":"1794d27705384d6e9f2c74f66cae2eb2e036ea2c06eaec91e618c5c2ed9d195f"} Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.385513 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bf8w8" Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.387060 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7f8nr" event={"ID":"f705ec78-e960-4200-b5a6-f3d4310f1bd5","Type":"ContainerStarted","Data":"14d155d97f6558b33920df0a1f0a2fae6e95d124a4be794d77fceb435430053f"} Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.387982 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7f8nr" Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.389743 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6v2sq" 
event={"ID":"01ea99aa-eb21-4799-9557-42c3fb55945a","Type":"ContainerStarted","Data":"5fa33257a5e5048798aca7a136a63ea6e6bfb6b1ba9b5a0a5be574c0d80f2544"} Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.391981 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-gv2sm" event={"ID":"b54e2237-603a-44ad-a129-04736cf749b2","Type":"ContainerStarted","Data":"fd38715546733b42ed8bcf4f24f786f982a201fe3ddb7aa977cdd779f8d33212"} Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.392480 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-gv2sm" Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.393778 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-8ms5w" event={"ID":"c44e76b2-0de9-4a5b-93ee-536c6300157f","Type":"ContainerStarted","Data":"70296fbb3ad4f6ca38bff2fddbcb5d04ddb04dbeaadb8abe0f16a454b63045e1"} Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.394184 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-8ms5w" Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.441336 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9" podStartSLOduration=18.939661329 podStartE2EDuration="27.441224907s" podCreationTimestamp="2026-03-09 13:38:08 +0000 UTC" firstStartedPulling="2026-03-09 13:38:25.688174192 +0000 UTC m=+1060.938346100" lastFinishedPulling="2026-03-09 13:38:34.18973777 +0000 UTC m=+1069.439909678" observedRunningTime="2026-03-09 13:38:35.410969998 +0000 UTC m=+1070.661141906" watchObservedRunningTime="2026-03-09 13:38:35.441224907 +0000 UTC m=+1070.691396825" Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.455282 
4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7f8nr" podStartSLOduration=3.334425669 podStartE2EDuration="26.455249453s" podCreationTimestamp="2026-03-09 13:38:09 +0000 UTC" firstStartedPulling="2026-03-09 13:38:11.068812474 +0000 UTC m=+1046.318984372" lastFinishedPulling="2026-03-09 13:38:34.189636258 +0000 UTC m=+1069.439808156" observedRunningTime="2026-03-09 13:38:35.439072941 +0000 UTC m=+1070.689244859" watchObservedRunningTime="2026-03-09 13:38:35.455249453 +0000 UTC m=+1070.705421371" Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.467199 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6v2sq" podStartSLOduration=3.208447443 podStartE2EDuration="26.467156563s" podCreationTimestamp="2026-03-09 13:38:09 +0000 UTC" firstStartedPulling="2026-03-09 13:38:11.045030573 +0000 UTC m=+1046.295202481" lastFinishedPulling="2026-03-09 13:38:34.303739693 +0000 UTC m=+1069.553911601" observedRunningTime="2026-03-09 13:38:35.466384253 +0000 UTC m=+1070.716556171" watchObservedRunningTime="2026-03-09 13:38:35.467156563 +0000 UTC m=+1070.717328471" Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.504637 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-gv2sm" podStartSLOduration=3.793507241 podStartE2EDuration="26.50460328s" podCreationTimestamp="2026-03-09 13:38:09 +0000 UTC" firstStartedPulling="2026-03-09 13:38:11.054512111 +0000 UTC m=+1046.304684019" lastFinishedPulling="2026-03-09 13:38:33.76560815 +0000 UTC m=+1069.015780058" observedRunningTime="2026-03-09 13:38:35.490977604 +0000 UTC m=+1070.741149522" watchObservedRunningTime="2026-03-09 13:38:35.50460328 +0000 UTC m=+1070.754775188" Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.522019 4764 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bf8w8" podStartSLOduration=3.363851555 podStartE2EDuration="26.521999713s" podCreationTimestamp="2026-03-09 13:38:09 +0000 UTC" firstStartedPulling="2026-03-09 13:38:11.032339662 +0000 UTC m=+1046.282511570" lastFinishedPulling="2026-03-09 13:38:34.19048782 +0000 UTC m=+1069.440659728" observedRunningTime="2026-03-09 13:38:35.516182252 +0000 UTC m=+1070.766354160" watchObservedRunningTime="2026-03-09 13:38:35.521999713 +0000 UTC m=+1070.772171621" Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.542972 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-8ms5w" podStartSLOduration=3.384121034 podStartE2EDuration="26.54295669s" podCreationTimestamp="2026-03-09 13:38:09 +0000 UTC" firstStartedPulling="2026-03-09 13:38:11.035370171 +0000 UTC m=+1046.285542079" lastFinishedPulling="2026-03-09 13:38:34.194205827 +0000 UTC m=+1069.444377735" observedRunningTime="2026-03-09 13:38:35.537569489 +0000 UTC m=+1070.787741397" watchObservedRunningTime="2026-03-09 13:38:35.54295669 +0000 UTC m=+1070.793128598" Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.568103 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-vkns5" podStartSLOduration=4.256144527 podStartE2EDuration="27.568070895s" podCreationTimestamp="2026-03-09 13:38:08 +0000 UTC" firstStartedPulling="2026-03-09 13:38:10.877362061 +0000 UTC m=+1046.127533969" lastFinishedPulling="2026-03-09 13:38:34.189288429 +0000 UTC m=+1069.439460337" observedRunningTime="2026-03-09 13:38:35.561285828 +0000 UTC m=+1070.811457736" watchObservedRunningTime="2026-03-09 13:38:35.568070895 +0000 UTC m=+1070.818242803" Mar 09 13:38:39 crc kubenswrapper[4764]: I0309 13:38:39.681051 4764 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-vkns5" Mar 09 13:38:39 crc kubenswrapper[4764]: I0309 13:38:39.864795 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-gv2sm" Mar 09 13:38:39 crc kubenswrapper[4764]: I0309 13:38:39.978801 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-8ms5w" Mar 09 13:38:40 crc kubenswrapper[4764]: I0309 13:38:40.003562 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bf8w8" Mar 09 13:38:40 crc kubenswrapper[4764]: I0309 13:38:40.224539 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7f8nr" Mar 09 13:38:41 crc kubenswrapper[4764]: I0309 13:38:41.387265 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm\" (UID: \"47bd7072-a414-4ce8-800b-753b7054be23\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" Mar 09 13:38:41 crc kubenswrapper[4764]: I0309 13:38:41.394950 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm\" (UID: \"47bd7072-a414-4ce8-800b-753b7054be23\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" Mar 09 13:38:41 crc kubenswrapper[4764]: I0309 13:38:41.444385 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" Mar 09 13:38:41 crc kubenswrapper[4764]: I0309 13:38:41.691907 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:41 crc kubenswrapper[4764]: I0309 13:38:41.692271 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:41 crc kubenswrapper[4764]: I0309 13:38:41.697105 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:41 crc kubenswrapper[4764]: I0309 13:38:41.697243 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:41 crc kubenswrapper[4764]: I0309 13:38:41.867941 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm"] Mar 09 13:38:41 crc kubenswrapper[4764]: W0309 13:38:41.870992 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47bd7072_a414_4ce8_800b_753b7054be23.slice/crio-f2d2c63ac396a240739b7e93660470611bb4046555c2f2ed2dcc9f8c30051d83 WatchSource:0}: Error finding container f2d2c63ac396a240739b7e93660470611bb4046555c2f2ed2dcc9f8c30051d83: Status 404 returned error can't find the container with id f2d2c63ac396a240739b7e93660470611bb4046555c2f2ed2dcc9f8c30051d83 Mar 09 13:38:41 crc kubenswrapper[4764]: I0309 13:38:41.899479 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:42 crc kubenswrapper[4764]: I0309 13:38:42.328959 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx"] Mar 09 13:38:42 crc kubenswrapper[4764]: W0309 13:38:42.347803 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode11f44d8_58a5_4fc7_b05b_e2e688647d01.slice/crio-9c79be0815ed12162432ecdd81215490c6a51b7f8a732124654ee50eaf6e69fc WatchSource:0}: Error finding container 9c79be0815ed12162432ecdd81215490c6a51b7f8a732124654ee50eaf6e69fc: Status 404 returned error can't find the container with id 9c79be0815ed12162432ecdd81215490c6a51b7f8a732124654ee50eaf6e69fc Mar 09 13:38:42 crc kubenswrapper[4764]: I0309 13:38:42.448627 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" event={"ID":"47bd7072-a414-4ce8-800b-753b7054be23","Type":"ContainerStarted","Data":"f2d2c63ac396a240739b7e93660470611bb4046555c2f2ed2dcc9f8c30051d83"} Mar 09 13:38:42 crc kubenswrapper[4764]: I0309 
13:38:42.450231 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" event={"ID":"e11f44d8-58a5-4fc7-b05b-e2e688647d01","Type":"ContainerStarted","Data":"9c79be0815ed12162432ecdd81215490c6a51b7f8a732124654ee50eaf6e69fc"} Mar 09 13:38:42 crc kubenswrapper[4764]: I0309 13:38:42.452034 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wv2rp" event={"ID":"32eb5815-c566-4177-8b47-f756807d4a30","Type":"ContainerStarted","Data":"ec192b101ade6b100e61556ec097b32f471e12e987e05ceaccafb3bb50ae0870"} Mar 09 13:38:42 crc kubenswrapper[4764]: I0309 13:38:42.452246 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wv2rp" Mar 09 13:38:42 crc kubenswrapper[4764]: I0309 13:38:42.476779 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wv2rp" podStartSLOduration=2.899313482 podStartE2EDuration="34.476759901s" podCreationTimestamp="2026-03-09 13:38:08 +0000 UTC" firstStartedPulling="2026-03-09 13:38:10.695247031 +0000 UTC m=+1045.945418939" lastFinishedPulling="2026-03-09 13:38:42.27269344 +0000 UTC m=+1077.522865358" observedRunningTime="2026-03-09 13:38:42.471055133 +0000 UTC m=+1077.721227051" watchObservedRunningTime="2026-03-09 13:38:42.476759901 +0000 UTC m=+1077.726931809" Mar 09 13:38:43 crc kubenswrapper[4764]: I0309 13:38:43.462974 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mjf6m" event={"ID":"488ff419-d889-4778-96cf-a11006c49507","Type":"ContainerStarted","Data":"97a74b6ae0cb2a171a7bd5d9948e05701136fa8d27cb2a79d7d1897aa7e69e4f"} Mar 09 13:38:43 crc kubenswrapper[4764]: I0309 13:38:43.463396 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mjf6m" Mar 09 13:38:43 crc kubenswrapper[4764]: I0309 13:38:43.464905 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-82cg8" event={"ID":"e220a3f1-4dbe-4ee6-9b19-26985fa998cf","Type":"ContainerStarted","Data":"2bcd7da1fd872402812c12b48979d7c3ee03de5d7e45c47c01e79c5313334463"} Mar 09 13:38:43 crc kubenswrapper[4764]: I0309 13:38:43.465351 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-82cg8" Mar 09 13:38:43 crc kubenswrapper[4764]: I0309 13:38:43.468151 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" event={"ID":"e11f44d8-58a5-4fc7-b05b-e2e688647d01","Type":"ContainerStarted","Data":"fef489b0b95c2cd0698c79035bece550540b4ca1ec0dbd718ea9cb6e1b8bf684"} Mar 09 13:38:43 crc kubenswrapper[4764]: I0309 13:38:43.468495 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:43 crc kubenswrapper[4764]: I0309 13:38:43.469985 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-nppjq" event={"ID":"4c271ca0-0c25-46d1-b730-e94f68397e29","Type":"ContainerStarted","Data":"c3d0b6dfa252d650ad458c943102b7c5083ddce28606fe88f69dfee63b60abd4"} Mar 09 13:38:43 crc kubenswrapper[4764]: I0309 13:38:43.470310 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-nppjq" Mar 09 13:38:43 crc kubenswrapper[4764]: I0309 13:38:43.483338 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mjf6m" 
podStartSLOduration=3.749033001 podStartE2EDuration="35.48332168s" podCreationTimestamp="2026-03-09 13:38:08 +0000 UTC" firstStartedPulling="2026-03-09 13:38:10.619208648 +0000 UTC m=+1045.869380556" lastFinishedPulling="2026-03-09 13:38:42.353497327 +0000 UTC m=+1077.603669235" observedRunningTime="2026-03-09 13:38:43.476387069 +0000 UTC m=+1078.726558977" watchObservedRunningTime="2026-03-09 13:38:43.48332168 +0000 UTC m=+1078.733493588" Mar 09 13:38:43 crc kubenswrapper[4764]: I0309 13:38:43.502893 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-82cg8" podStartSLOduration=2.994644228 podStartE2EDuration="35.50287792s" podCreationTimestamp="2026-03-09 13:38:08 +0000 UTC" firstStartedPulling="2026-03-09 13:38:10.064102362 +0000 UTC m=+1045.314274270" lastFinishedPulling="2026-03-09 13:38:42.572336054 +0000 UTC m=+1077.822507962" observedRunningTime="2026-03-09 13:38:43.500377715 +0000 UTC m=+1078.750549623" watchObservedRunningTime="2026-03-09 13:38:43.50287792 +0000 UTC m=+1078.753049818" Mar 09 13:38:43 crc kubenswrapper[4764]: I0309 13:38:43.539900 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" podStartSLOduration=34.539882035 podStartE2EDuration="34.539882035s" podCreationTimestamp="2026-03-09 13:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:38:43.535200943 +0000 UTC m=+1078.785372871" watchObservedRunningTime="2026-03-09 13:38:43.539882035 +0000 UTC m=+1078.790053943" Mar 09 13:38:43 crc kubenswrapper[4764]: I0309 13:38:43.561133 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-nppjq" podStartSLOduration=3.16188448 podStartE2EDuration="35.561115819s" 
podCreationTimestamp="2026-03-09 13:38:08 +0000 UTC" firstStartedPulling="2026-03-09 13:38:10.171798181 +0000 UTC m=+1045.421970079" lastFinishedPulling="2026-03-09 13:38:42.5710295 +0000 UTC m=+1077.821201418" observedRunningTime="2026-03-09 13:38:43.556384536 +0000 UTC m=+1078.806556444" watchObservedRunningTime="2026-03-09 13:38:43.561115819 +0000 UTC m=+1078.811287727" Mar 09 13:38:44 crc kubenswrapper[4764]: I0309 13:38:44.924890 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9" Mar 09 13:38:45 crc kubenswrapper[4764]: I0309 13:38:45.491613 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" event={"ID":"47bd7072-a414-4ce8-800b-753b7054be23","Type":"ContainerStarted","Data":"debf0a3ca63ffa338695e3592e5d4da09f8382e334d0971aa3fba5cab6bb3a08"} Mar 09 13:38:45 crc kubenswrapper[4764]: I0309 13:38:45.520947 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" podStartSLOduration=33.953253568 podStartE2EDuration="36.520914508s" podCreationTimestamp="2026-03-09 13:38:09 +0000 UTC" firstStartedPulling="2026-03-09 13:38:41.874147596 +0000 UTC m=+1077.124319514" lastFinishedPulling="2026-03-09 13:38:44.441808546 +0000 UTC m=+1079.691980454" observedRunningTime="2026-03-09 13:38:45.515086136 +0000 UTC m=+1080.765258044" watchObservedRunningTime="2026-03-09 13:38:45.520914508 +0000 UTC m=+1080.771086416" Mar 09 13:38:46 crc kubenswrapper[4764]: I0309 13:38:46.499612 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" Mar 09 13:38:48 crc kubenswrapper[4764]: I0309 13:38:48.981034 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/barbican-operator-controller-manager-6db6876945-82cg8" Mar 09 13:38:49 crc kubenswrapper[4764]: I0309 13:38:48.998376 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-nppjq" Mar 09 13:38:49 crc kubenswrapper[4764]: I0309 13:38:49.109500 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mjf6m" Mar 09 13:38:49 crc kubenswrapper[4764]: I0309 13:38:49.572564 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wv2rp" Mar 09 13:38:49 crc kubenswrapper[4764]: I0309 13:38:49.748196 4764 scope.go:117] "RemoveContainer" containerID="492bc9f6bc85937ff3b8aee6d3f29e3e150f1bc426852bf9cdf6e5878286e321" Mar 09 13:38:51 crc kubenswrapper[4764]: I0309 13:38:51.451626 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" Mar 09 13:38:51 crc kubenswrapper[4764]: I0309 13:38:51.905701 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.017161 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ggxj5"] Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.019712 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ggxj5" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.022028 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.024380 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.024861 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-x8stb" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.029509 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.033941 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ggxj5"] Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.144198 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79abb676-0322-4a75-90f5-743c942073b4-config\") pod \"dnsmasq-dns-675f4bcbfc-ggxj5\" (UID: \"79abb676-0322-4a75-90f5-743c942073b4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ggxj5" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.144269 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzz2f\" (UniqueName: \"kubernetes.io/projected/79abb676-0322-4a75-90f5-743c942073b4-kube-api-access-wzz2f\") pod \"dnsmasq-dns-675f4bcbfc-ggxj5\" (UID: \"79abb676-0322-4a75-90f5-743c942073b4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ggxj5" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.164218 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fcc4m"] Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.166078 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-fcc4m" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.169211 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.187826 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fcc4m"] Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.245734 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79abb676-0322-4a75-90f5-743c942073b4-config\") pod \"dnsmasq-dns-675f4bcbfc-ggxj5\" (UID: \"79abb676-0322-4a75-90f5-743c942073b4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ggxj5" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.246595 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-config\") pod \"dnsmasq-dns-78dd6ddcc-fcc4m\" (UID: \"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fcc4m" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.246531 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79abb676-0322-4a75-90f5-743c942073b4-config\") pod \"dnsmasq-dns-675f4bcbfc-ggxj5\" (UID: \"79abb676-0322-4a75-90f5-743c942073b4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ggxj5" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.246673 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzz2f\" (UniqueName: \"kubernetes.io/projected/79abb676-0322-4a75-90f5-743c942073b4-kube-api-access-wzz2f\") pod \"dnsmasq-dns-675f4bcbfc-ggxj5\" (UID: \"79abb676-0322-4a75-90f5-743c942073b4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ggxj5" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.246768 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-fcc4m\" (UID: \"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fcc4m" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.246812 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cbdt\" (UniqueName: \"kubernetes.io/projected/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-kube-api-access-8cbdt\") pod \"dnsmasq-dns-78dd6ddcc-fcc4m\" (UID: \"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fcc4m" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.272324 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzz2f\" (UniqueName: \"kubernetes.io/projected/79abb676-0322-4a75-90f5-743c942073b4-kube-api-access-wzz2f\") pod \"dnsmasq-dns-675f4bcbfc-ggxj5\" (UID: \"79abb676-0322-4a75-90f5-743c942073b4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ggxj5" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.340137 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ggxj5" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.349263 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-fcc4m\" (UID: \"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fcc4m" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.349310 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cbdt\" (UniqueName: \"kubernetes.io/projected/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-kube-api-access-8cbdt\") pod \"dnsmasq-dns-78dd6ddcc-fcc4m\" (UID: \"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fcc4m" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.349376 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-config\") pod \"dnsmasq-dns-78dd6ddcc-fcc4m\" (UID: \"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fcc4m" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.350268 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-config\") pod \"dnsmasq-dns-78dd6ddcc-fcc4m\" (UID: \"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fcc4m" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.350373 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-fcc4m\" (UID: \"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fcc4m" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 
13:39:10.376419 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cbdt\" (UniqueName: \"kubernetes.io/projected/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-kube-api-access-8cbdt\") pod \"dnsmasq-dns-78dd6ddcc-fcc4m\" (UID: \"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fcc4m" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.487335 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-fcc4m" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.830376 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ggxj5"] Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.898681 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fcc4m"] Mar 09 13:39:10 crc kubenswrapper[4764]: W0309 13:39:10.903222 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bf8d6e3_823d_4f28_b5dd_e4df591df8bd.slice/crio-790e722a1a82b199681b86b0375d2d82ee82392b6ce4b20994ef141a06eb2c50 WatchSource:0}: Error finding container 790e722a1a82b199681b86b0375d2d82ee82392b6ce4b20994ef141a06eb2c50: Status 404 returned error can't find the container with id 790e722a1a82b199681b86b0375d2d82ee82392b6ce4b20994ef141a06eb2c50 Mar 09 13:39:11 crc kubenswrapper[4764]: I0309 13:39:11.737255 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-ggxj5" event={"ID":"79abb676-0322-4a75-90f5-743c942073b4","Type":"ContainerStarted","Data":"689139240ebd2f6bf1f73c798b5e7965a7b610107fbcfeb6b865f8c4180b122b"} Mar 09 13:39:11 crc kubenswrapper[4764]: I0309 13:39:11.739188 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-fcc4m" 
event={"ID":"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd","Type":"ContainerStarted","Data":"790e722a1a82b199681b86b0375d2d82ee82392b6ce4b20994ef141a06eb2c50"} Mar 09 13:39:12 crc kubenswrapper[4764]: I0309 13:39:12.991278 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ggxj5"] Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.012751 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nc2v8"] Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.014497 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.025671 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nc2v8"] Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.042001 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85351658-0136-4066-b39e-808260c4dae9-config\") pod \"dnsmasq-dns-5ccc8479f9-nc2v8\" (UID: \"85351658-0136-4066-b39e-808260c4dae9\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.042068 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85351658-0136-4066-b39e-808260c4dae9-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-nc2v8\" (UID: \"85351658-0136-4066-b39e-808260c4dae9\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.042236 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltsrn\" (UniqueName: \"kubernetes.io/projected/85351658-0136-4066-b39e-808260c4dae9-kube-api-access-ltsrn\") pod \"dnsmasq-dns-5ccc8479f9-nc2v8\" (UID: \"85351658-0136-4066-b39e-808260c4dae9\") " 
pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.143910 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85351658-0136-4066-b39e-808260c4dae9-config\") pod \"dnsmasq-dns-5ccc8479f9-nc2v8\" (UID: \"85351658-0136-4066-b39e-808260c4dae9\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.143975 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85351658-0136-4066-b39e-808260c4dae9-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-nc2v8\" (UID: \"85351658-0136-4066-b39e-808260c4dae9\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.144009 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltsrn\" (UniqueName: \"kubernetes.io/projected/85351658-0136-4066-b39e-808260c4dae9-kube-api-access-ltsrn\") pod \"dnsmasq-dns-5ccc8479f9-nc2v8\" (UID: \"85351658-0136-4066-b39e-808260c4dae9\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.145481 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85351658-0136-4066-b39e-808260c4dae9-config\") pod \"dnsmasq-dns-5ccc8479f9-nc2v8\" (UID: \"85351658-0136-4066-b39e-808260c4dae9\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.146030 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85351658-0136-4066-b39e-808260c4dae9-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-nc2v8\" (UID: \"85351658-0136-4066-b39e-808260c4dae9\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.183393 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltsrn\" (UniqueName: \"kubernetes.io/projected/85351658-0136-4066-b39e-808260c4dae9-kube-api-access-ltsrn\") pod \"dnsmasq-dns-5ccc8479f9-nc2v8\" (UID: \"85351658-0136-4066-b39e-808260c4dae9\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8"
Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.337842 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fcc4m"]
Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.349135 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8"
Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.375443 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-cwgfl"]
Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.377209 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl"
Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.394719 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-cwgfl"]
Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.452411 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dtxl\" (UniqueName: \"kubernetes.io/projected/04d22384-e765-4ac8-9afa-7a31f4c347b2-kube-api-access-7dtxl\") pod \"dnsmasq-dns-57d769cc4f-cwgfl\" (UID: \"04d22384-e765-4ac8-9afa-7a31f4c347b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl"
Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.452494 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04d22384-e765-4ac8-9afa-7a31f4c347b2-config\") pod \"dnsmasq-dns-57d769cc4f-cwgfl\" (UID: \"04d22384-e765-4ac8-9afa-7a31f4c347b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl"
Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.452571 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04d22384-e765-4ac8-9afa-7a31f4c347b2-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-cwgfl\" (UID: \"04d22384-e765-4ac8-9afa-7a31f4c347b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl"
Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.555193 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dtxl\" (UniqueName: \"kubernetes.io/projected/04d22384-e765-4ac8-9afa-7a31f4c347b2-kube-api-access-7dtxl\") pod \"dnsmasq-dns-57d769cc4f-cwgfl\" (UID: \"04d22384-e765-4ac8-9afa-7a31f4c347b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl"
Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.555269 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04d22384-e765-4ac8-9afa-7a31f4c347b2-config\") pod \"dnsmasq-dns-57d769cc4f-cwgfl\" (UID: \"04d22384-e765-4ac8-9afa-7a31f4c347b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl"
Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.555347 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04d22384-e765-4ac8-9afa-7a31f4c347b2-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-cwgfl\" (UID: \"04d22384-e765-4ac8-9afa-7a31f4c347b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl"
Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.556409 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04d22384-e765-4ac8-9afa-7a31f4c347b2-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-cwgfl\" (UID: \"04d22384-e765-4ac8-9afa-7a31f4c347b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl"
Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.557282 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04d22384-e765-4ac8-9afa-7a31f4c347b2-config\") pod \"dnsmasq-dns-57d769cc4f-cwgfl\" (UID: \"04d22384-e765-4ac8-9afa-7a31f4c347b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl"
Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.575902 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dtxl\" (UniqueName: \"kubernetes.io/projected/04d22384-e765-4ac8-9afa-7a31f4c347b2-kube-api-access-7dtxl\") pod \"dnsmasq-dns-57d769cc4f-cwgfl\" (UID: \"04d22384-e765-4ac8-9afa-7a31f4c347b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl"
Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.783030 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl"
Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.813040 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nc2v8"]
Mar 09 13:39:13 crc kubenswrapper[4764]: W0309 13:39:13.819915 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85351658_0136_4066_b39e_808260c4dae9.slice/crio-cf38f50edd1cd803b2485a43d9bb0cf84c48deb7190f61ca1abae1471ed84bf4 WatchSource:0}: Error finding container cf38f50edd1cd803b2485a43d9bb0cf84c48deb7190f61ca1abae1471ed84bf4: Status 404 returned error can't find the container with id cf38f50edd1cd803b2485a43d9bb0cf84c48deb7190f61ca1abae1471ed84bf4
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.120432 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-cwgfl"]
Mar 09 13:39:14 crc kubenswrapper[4764]: W0309 13:39:14.124607 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04d22384_e765_4ac8_9afa_7a31f4c347b2.slice/crio-e9eba66ff4604aeb6e4bde7029daea515e55ea3fe70bfdcf16702550cc3774f9 WatchSource:0}: Error finding container e9eba66ff4604aeb6e4bde7029daea515e55ea3fe70bfdcf16702550cc3774f9: Status 404 returned error can't find the container with id e9eba66ff4604aeb6e4bde7029daea515e55ea3fe70bfdcf16702550cc3774f9
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.168753 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.170255 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.173269 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.173491 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.173550 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.174673 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-6m67z"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.175877 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.177926 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.180754 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.187672 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.375227 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.375335 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.375399 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.375424 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.375463 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.375485 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.375512 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.375548 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.375582 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.375900 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.376037 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bzh8\" (UniqueName: \"kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-kube-api-access-4bzh8\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.477450 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.477552 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.477577 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.477619 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.477655 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.477682 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.477714 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.477760 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.477804 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.477844 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bzh8\" (UniqueName: \"kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-kube-api-access-4bzh8\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.477905 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.478326 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.479050 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.479506 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.487750 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.487964 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.488800 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.493152 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.509025 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bzh8\" (UniqueName: \"kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-kube-api-access-4bzh8\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.510789 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.513135 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.513548 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.514444 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.515830 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.519031 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.527196 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.527423 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.527571 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.532583 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.532912 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.534143 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-8dlbf"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.534297 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.539166 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.681820 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/507bcef1-e9ef-4eb1-85ce-358209b944bc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.681895 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/507bcef1-e9ef-4eb1-85ce-358209b944bc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.681932 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-config-data\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.681966 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.682241 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.682335 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w75z\" (UniqueName: \"kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-kube-api-access-2w75z\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.682507 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.682602 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.684040 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.684113 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.684157 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.768756 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" event={"ID":"85351658-0136-4066-b39e-808260c4dae9","Type":"ContainerStarted","Data":"cf38f50edd1cd803b2485a43d9bb0cf84c48deb7190f61ca1abae1471ed84bf4"}
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.770969 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl" event={"ID":"04d22384-e765-4ac8-9afa-7a31f4c347b2","Type":"ContainerStarted","Data":"e9eba66ff4604aeb6e4bde7029daea515e55ea3fe70bfdcf16702550cc3774f9"}
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.785571 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.785663 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.785748 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.785780 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.785802 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.785825 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/507bcef1-e9ef-4eb1-85ce-358209b944bc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.785846 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/507bcef1-e9ef-4eb1-85ce-358209b944bc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.785872 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-config-data\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.785893 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.785911 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.785934 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w75z\" (UniqueName: \"kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-kube-api-access-2w75z\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.786017 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.786267 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.786563 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.787637 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.787825 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-config-data\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.788105 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.794932 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.795759 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/507bcef1-e9ef-4eb1-85ce-358209b944bc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.796124 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/507bcef1-e9ef-4eb1-85ce-358209b944bc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.805441 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w75z\" (UniqueName: \"kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-kube-api-access-2w75z\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.821048 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.827981 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.829963 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.905516 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.604238 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.606451 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.615747 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.616304 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.616369 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-8dgkg"
Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.621488 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.627947 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.629063 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.711161 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c87ed75-4285-4084-bce3-ee8dba7671c0-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0"
Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.711351 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c87ed75-4285-4084-bce3-ee8dba7671c0-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0"
Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.711617 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0"
Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.711694 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0c87ed75-4285-4084-bce3-ee8dba7671c0-config-data-default\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0"
Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.711802 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c57n\" (UniqueName: \"kubernetes.io/projected/0c87ed75-4285-4084-bce3-ee8dba7671c0-kube-api-access-4c57n\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0"
Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.712003 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c87ed75-4285-4084-bce3-ee8dba7671c0-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0"
Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.712070 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0c87ed75-4285-4084-bce3-ee8dba7671c0-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0"
Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.712102 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName:
\"kubernetes.io/configmap/0c87ed75-4285-4084-bce3-ee8dba7671c0-kolla-config\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.815531 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c57n\" (UniqueName: \"kubernetes.io/projected/0c87ed75-4285-4084-bce3-ee8dba7671c0-kube-api-access-4c57n\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.815626 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c87ed75-4285-4084-bce3-ee8dba7671c0-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.815672 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0c87ed75-4285-4084-bce3-ee8dba7671c0-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.815691 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0c87ed75-4285-4084-bce3-ee8dba7671c0-kolla-config\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.815722 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c87ed75-4285-4084-bce3-ee8dba7671c0-galera-tls-certs\") pod \"openstack-galera-0\" (UID: 
\"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.815776 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c87ed75-4285-4084-bce3-ee8dba7671c0-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.815811 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.815831 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0c87ed75-4285-4084-bce3-ee8dba7671c0-config-data-default\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.817866 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c87ed75-4285-4084-bce3-ee8dba7671c0-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.817961 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0c87ed75-4285-4084-bce3-ee8dba7671c0-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.818011 4764 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.818275 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0c87ed75-4285-4084-bce3-ee8dba7671c0-config-data-default\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.825222 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0c87ed75-4285-4084-bce3-ee8dba7671c0-kolla-config\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.841894 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c87ed75-4285-4084-bce3-ee8dba7671c0-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.848820 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c57n\" (UniqueName: \"kubernetes.io/projected/0c87ed75-4285-4084-bce3-ee8dba7671c0-kube-api-access-4c57n\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.856053 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0c87ed75-4285-4084-bce3-ee8dba7671c0-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.890394 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.940911 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 09 13:39:16 crc kubenswrapper[4764]: I0309 13:39:16.956431 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 09 13:39:16 crc kubenswrapper[4764]: I0309 13:39:16.958178 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:16 crc kubenswrapper[4764]: I0309 13:39:16.961928 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 09 13:39:16 crc kubenswrapper[4764]: I0309 13:39:16.963098 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 09 13:39:16 crc kubenswrapper[4764]: I0309 13:39:16.963442 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 09 13:39:16 crc kubenswrapper[4764]: I0309 13:39:16.963601 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-cw24b" Mar 09 13:39:16 crc kubenswrapper[4764]: I0309 13:39:16.972441 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.065273 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/103cd40b-aa84-4973-8e47-8a67e5994c80-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.065320 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.065344 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/103cd40b-aa84-4973-8e47-8a67e5994c80-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.065383 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103cd40b-aa84-4973-8e47-8a67e5994c80-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.065411 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/103cd40b-aa84-4973-8e47-8a67e5994c80-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.065445 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/103cd40b-aa84-4973-8e47-8a67e5994c80-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.065487 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/103cd40b-aa84-4973-8e47-8a67e5994c80-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.065511 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stfnd\" (UniqueName: \"kubernetes.io/projected/103cd40b-aa84-4973-8e47-8a67e5994c80-kube-api-access-stfnd\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.168068 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/103cd40b-aa84-4973-8e47-8a67e5994c80-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.168131 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.168157 4764 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/103cd40b-aa84-4973-8e47-8a67e5994c80-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.168191 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103cd40b-aa84-4973-8e47-8a67e5994c80-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.168226 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/103cd40b-aa84-4973-8e47-8a67e5994c80-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.168260 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/103cd40b-aa84-4973-8e47-8a67e5994c80-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.168298 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/103cd40b-aa84-4973-8e47-8a67e5994c80-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.168323 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stfnd\" (UniqueName: 
\"kubernetes.io/projected/103cd40b-aa84-4973-8e47-8a67e5994c80-kube-api-access-stfnd\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.170306 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/103cd40b-aa84-4973-8e47-8a67e5994c80-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.170435 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.171302 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/103cd40b-aa84-4973-8e47-8a67e5994c80-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.172016 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/103cd40b-aa84-4973-8e47-8a67e5994c80-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.174695 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/103cd40b-aa84-4973-8e47-8a67e5994c80-config-data-default\") 
pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.181393 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/103cd40b-aa84-4973-8e47-8a67e5994c80-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.181515 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103cd40b-aa84-4973-8e47-8a67e5994c80-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.208342 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.211308 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stfnd\" (UniqueName: \"kubernetes.io/projected/103cd40b-aa84-4973-8e47-8a67e5994c80-kube-api-access-stfnd\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.295327 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.413900 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.415011 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.419953 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.420239 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-69l42" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.420415 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.463772 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.576738 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/519ac270-ea24-47c1-b4f3-d94b0add96d1-memcached-tls-certs\") pod \"memcached-0\" (UID: \"519ac270-ea24-47c1-b4f3-d94b0add96d1\") " pod="openstack/memcached-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.576787 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tpjx\" (UniqueName: \"kubernetes.io/projected/519ac270-ea24-47c1-b4f3-d94b0add96d1-kube-api-access-7tpjx\") pod \"memcached-0\" (UID: \"519ac270-ea24-47c1-b4f3-d94b0add96d1\") " pod="openstack/memcached-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.576844 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/519ac270-ea24-47c1-b4f3-d94b0add96d1-kolla-config\") pod \"memcached-0\" (UID: \"519ac270-ea24-47c1-b4f3-d94b0add96d1\") " pod="openstack/memcached-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.576935 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/519ac270-ea24-47c1-b4f3-d94b0add96d1-config-data\") pod \"memcached-0\" (UID: \"519ac270-ea24-47c1-b4f3-d94b0add96d1\") " pod="openstack/memcached-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.576966 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/519ac270-ea24-47c1-b4f3-d94b0add96d1-combined-ca-bundle\") pod \"memcached-0\" (UID: \"519ac270-ea24-47c1-b4f3-d94b0add96d1\") " pod="openstack/memcached-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.678519 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/519ac270-ea24-47c1-b4f3-d94b0add96d1-kolla-config\") pod \"memcached-0\" (UID: \"519ac270-ea24-47c1-b4f3-d94b0add96d1\") " pod="openstack/memcached-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.678638 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/519ac270-ea24-47c1-b4f3-d94b0add96d1-config-data\") pod \"memcached-0\" (UID: \"519ac270-ea24-47c1-b4f3-d94b0add96d1\") " pod="openstack/memcached-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.678703 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/519ac270-ea24-47c1-b4f3-d94b0add96d1-combined-ca-bundle\") pod \"memcached-0\" (UID: \"519ac270-ea24-47c1-b4f3-d94b0add96d1\") " pod="openstack/memcached-0" Mar 09 13:39:17 
crc kubenswrapper[4764]: I0309 13:39:17.678777 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/519ac270-ea24-47c1-b4f3-d94b0add96d1-memcached-tls-certs\") pod \"memcached-0\" (UID: \"519ac270-ea24-47c1-b4f3-d94b0add96d1\") " pod="openstack/memcached-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.678804 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tpjx\" (UniqueName: \"kubernetes.io/projected/519ac270-ea24-47c1-b4f3-d94b0add96d1-kube-api-access-7tpjx\") pod \"memcached-0\" (UID: \"519ac270-ea24-47c1-b4f3-d94b0add96d1\") " pod="openstack/memcached-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.680096 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/519ac270-ea24-47c1-b4f3-d94b0add96d1-config-data\") pod \"memcached-0\" (UID: \"519ac270-ea24-47c1-b4f3-d94b0add96d1\") " pod="openstack/memcached-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.681034 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/519ac270-ea24-47c1-b4f3-d94b0add96d1-kolla-config\") pod \"memcached-0\" (UID: \"519ac270-ea24-47c1-b4f3-d94b0add96d1\") " pod="openstack/memcached-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.682145 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/519ac270-ea24-47c1-b4f3-d94b0add96d1-memcached-tls-certs\") pod \"memcached-0\" (UID: \"519ac270-ea24-47c1-b4f3-d94b0add96d1\") " pod="openstack/memcached-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.688351 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/519ac270-ea24-47c1-b4f3-d94b0add96d1-combined-ca-bundle\") pod \"memcached-0\" (UID: \"519ac270-ea24-47c1-b4f3-d94b0add96d1\") " pod="openstack/memcached-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.701168 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tpjx\" (UniqueName: \"kubernetes.io/projected/519ac270-ea24-47c1-b4f3-d94b0add96d1-kube-api-access-7tpjx\") pod \"memcached-0\" (UID: \"519ac270-ea24-47c1-b4f3-d94b0add96d1\") " pod="openstack/memcached-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.750468 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 09 13:39:19 crc kubenswrapper[4764]: I0309 13:39:19.747294 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 13:39:19 crc kubenswrapper[4764]: I0309 13:39:19.751062 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 13:39:19 crc kubenswrapper[4764]: I0309 13:39:19.754525 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-zzhlv" Mar 09 13:39:19 crc kubenswrapper[4764]: I0309 13:39:19.759500 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 13:39:19 crc kubenswrapper[4764]: I0309 13:39:19.837481 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gphmn\" (UniqueName: \"kubernetes.io/projected/ce660994-4427-4d54-b83c-9c9ec7f64a9d-kube-api-access-gphmn\") pod \"kube-state-metrics-0\" (UID: \"ce660994-4427-4d54-b83c-9c9ec7f64a9d\") " pod="openstack/kube-state-metrics-0" Mar 09 13:39:19 crc kubenswrapper[4764]: I0309 13:39:19.939546 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gphmn\" (UniqueName: 
\"kubernetes.io/projected/ce660994-4427-4d54-b83c-9c9ec7f64a9d-kube-api-access-gphmn\") pod \"kube-state-metrics-0\" (UID: \"ce660994-4427-4d54-b83c-9c9ec7f64a9d\") " pod="openstack/kube-state-metrics-0" Mar 09 13:39:19 crc kubenswrapper[4764]: I0309 13:39:19.973456 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gphmn\" (UniqueName: \"kubernetes.io/projected/ce660994-4427-4d54-b83c-9c9ec7f64a9d-kube-api-access-gphmn\") pod \"kube-state-metrics-0\" (UID: \"ce660994-4427-4d54-b83c-9c9ec7f64a9d\") " pod="openstack/kube-state-metrics-0" Mar 09 13:39:20 crc kubenswrapper[4764]: I0309 13:39:20.082889 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.272094 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-qm7vs"] Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.275702 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.278216 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-dwvcr" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.278823 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.280439 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.289160 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-2zkzm"] Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.290856 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.306618 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qm7vs"] Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.316432 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-2zkzm"] Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.417892 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9bbe03cf-76d5-440a-903f-50c382aa3a4e-var-log-ovn\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.418085 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9bbe03cf-76d5-440a-903f-50c382aa3a4e-var-run\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.418265 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/05f9485e-b683-481d-87d3-fb86ebb4a832-var-lib\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.418471 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8vwf\" (UniqueName: \"kubernetes.io/projected/9bbe03cf-76d5-440a-903f-50c382aa3a4e-kube-api-access-p8vwf\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 
13:39:23.418579 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/05f9485e-b683-481d-87d3-fb86ebb4a832-var-run\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.418621 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bbe03cf-76d5-440a-903f-50c382aa3a4e-ovn-controller-tls-certs\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.418868 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bbe03cf-76d5-440a-903f-50c382aa3a4e-combined-ca-bundle\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.418975 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9bbe03cf-76d5-440a-903f-50c382aa3a4e-var-run-ovn\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.419339 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9bbe03cf-76d5-440a-903f-50c382aa3a4e-scripts\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.419443 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05f9485e-b683-481d-87d3-fb86ebb4a832-scripts\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.419528 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/05f9485e-b683-481d-87d3-fb86ebb4a832-var-log\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.419606 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v89d9\" (UniqueName: \"kubernetes.io/projected/05f9485e-b683-481d-87d3-fb86ebb4a832-kube-api-access-v89d9\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.419693 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/05f9485e-b683-481d-87d3-fb86ebb4a832-etc-ovs\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.521346 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9bbe03cf-76d5-440a-903f-50c382aa3a4e-var-run\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.521403 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/05f9485e-b683-481d-87d3-fb86ebb4a832-var-lib\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.521438 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8vwf\" (UniqueName: \"kubernetes.io/projected/9bbe03cf-76d5-440a-903f-50c382aa3a4e-kube-api-access-p8vwf\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.521470 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/05f9485e-b683-481d-87d3-fb86ebb4a832-var-run\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.521492 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bbe03cf-76d5-440a-903f-50c382aa3a4e-ovn-controller-tls-certs\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.521519 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bbe03cf-76d5-440a-903f-50c382aa3a4e-combined-ca-bundle\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.521549 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/9bbe03cf-76d5-440a-903f-50c382aa3a4e-var-run-ovn\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.521589 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9bbe03cf-76d5-440a-903f-50c382aa3a4e-scripts\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.521616 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05f9485e-b683-481d-87d3-fb86ebb4a832-scripts\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.521669 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/05f9485e-b683-481d-87d3-fb86ebb4a832-var-log\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.521696 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v89d9\" (UniqueName: \"kubernetes.io/projected/05f9485e-b683-481d-87d3-fb86ebb4a832-kube-api-access-v89d9\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.521727 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/05f9485e-b683-481d-87d3-fb86ebb4a832-etc-ovs\") pod \"ovn-controller-ovs-2zkzm\" (UID: 
\"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.521774 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9bbe03cf-76d5-440a-903f-50c382aa3a4e-var-log-ovn\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.521980 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9bbe03cf-76d5-440a-903f-50c382aa3a4e-var-run\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.522018 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9bbe03cf-76d5-440a-903f-50c382aa3a4e-var-log-ovn\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.522126 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/05f9485e-b683-481d-87d3-fb86ebb4a832-var-lib\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.522145 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/05f9485e-b683-481d-87d3-fb86ebb4a832-var-log\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.522198 4764 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9bbe03cf-76d5-440a-903f-50c382aa3a4e-var-run-ovn\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.522350 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/05f9485e-b683-481d-87d3-fb86ebb4a832-var-run\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.522512 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/05f9485e-b683-481d-87d3-fb86ebb4a832-etc-ovs\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.526906 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9bbe03cf-76d5-440a-903f-50c382aa3a4e-scripts\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.526953 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05f9485e-b683-481d-87d3-fb86ebb4a832-scripts\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.528425 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bbe03cf-76d5-440a-903f-50c382aa3a4e-combined-ca-bundle\") pod \"ovn-controller-qm7vs\" (UID: 
\"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.528490 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bbe03cf-76d5-440a-903f-50c382aa3a4e-ovn-controller-tls-certs\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.548916 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v89d9\" (UniqueName: \"kubernetes.io/projected/05f9485e-b683-481d-87d3-fb86ebb4a832-kube-api-access-v89d9\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.553335 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8vwf\" (UniqueName: \"kubernetes.io/projected/9bbe03cf-76d5-440a-903f-50c382aa3a4e-kube-api-access-p8vwf\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.609430 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.621269 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.140910 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.143873 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.148435 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.150316 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-krxnp" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.150602 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.150840 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.151020 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.154861 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.239813 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.239983 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.240042 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-lwcwk\" (UniqueName: \"kubernetes.io/projected/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-kube-api-access-lwcwk\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.240197 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-config\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.240241 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.240320 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.240452 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.240615 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.342553 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.342631 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.343098 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.343156 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.343181 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwcwk\" (UniqueName: \"kubernetes.io/projected/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-kube-api-access-lwcwk\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " 
pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.343244 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-config\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.343298 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.343354 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.343510 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.344803 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.344952 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-config\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.346881 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.350049 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.350577 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.352639 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.365854 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwcwk\" (UniqueName: \"kubernetes.io/projected/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-kube-api-access-lwcwk\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " 
pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.370070 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.470845 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.334674 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.337119 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.344798 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.345163 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.345221 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-gcsfc" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.345365 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.356023 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.490408 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/047aa387-9e35-4ec6-89a9-3be60e47610b-config\") pod \"ovsdbserver-sb-0\" 
(UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.490469 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7286\" (UniqueName: \"kubernetes.io/projected/047aa387-9e35-4ec6-89a9-3be60e47610b-kube-api-access-w7286\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.490498 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/047aa387-9e35-4ec6-89a9-3be60e47610b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.490554 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/047aa387-9e35-4ec6-89a9-3be60e47610b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.490576 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.490623 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/047aa387-9e35-4ec6-89a9-3be60e47610b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " 
pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.490722 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047aa387-9e35-4ec6-89a9-3be60e47610b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.490782 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/047aa387-9e35-4ec6-89a9-3be60e47610b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.593913 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/047aa387-9e35-4ec6-89a9-3be60e47610b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.594050 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/047aa387-9e35-4ec6-89a9-3be60e47610b-config\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.594136 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7286\" (UniqueName: \"kubernetes.io/projected/047aa387-9e35-4ec6-89a9-3be60e47610b-kube-api-access-w7286\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.594161 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/047aa387-9e35-4ec6-89a9-3be60e47610b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.594233 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/047aa387-9e35-4ec6-89a9-3be60e47610b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.594255 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.594330 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/047aa387-9e35-4ec6-89a9-3be60e47610b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.594372 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047aa387-9e35-4ec6-89a9-3be60e47610b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.596883 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: 
\"047aa387-9e35-4ec6-89a9-3be60e47610b\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.597468 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/047aa387-9e35-4ec6-89a9-3be60e47610b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.597547 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/047aa387-9e35-4ec6-89a9-3be60e47610b-config\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.597840 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/047aa387-9e35-4ec6-89a9-3be60e47610b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.600435 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/047aa387-9e35-4ec6-89a9-3be60e47610b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.601085 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/047aa387-9e35-4ec6-89a9-3be60e47610b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.601803 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047aa387-9e35-4ec6-89a9-3be60e47610b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.616231 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7286\" (UniqueName: \"kubernetes.io/projected/047aa387-9e35-4ec6-89a9-3be60e47610b-kube-api-access-w7286\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.631078 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.671943 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:29 crc kubenswrapper[4764]: E0309 13:39:29.354366 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 09 13:39:29 crc kubenswrapper[4764]: E0309 13:39:29.354917 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8cbdt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityCont
ext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-fcc4m_openstack(6bf8d6e3-823d-4f28-b5dd-e4df591df8bd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:39:29 crc kubenswrapper[4764]: E0309 13:39:29.356321 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-fcc4m" podUID="6bf8d6e3-823d-4f28-b5dd-e4df591df8bd" Mar 09 13:39:29 crc kubenswrapper[4764]: E0309 13:39:29.387628 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 09 13:39:29 crc kubenswrapper[4764]: E0309 13:39:29.387855 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ltsrn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-nc2v8_openstack(85351658-0136-4066-b39e-808260c4dae9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:39:29 crc kubenswrapper[4764]: E0309 13:39:29.388956 4764 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" podUID="85351658-0136-4066-b39e-808260c4dae9" Mar 09 13:39:29 crc kubenswrapper[4764]: E0309 13:39:29.423783 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 09 13:39:29 crc kubenswrapper[4764]: E0309 13:39:29.423972 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wzz2f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-ggxj5_openstack(79abb676-0322-4a75-90f5-743c942073b4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:39:29 crc kubenswrapper[4764]: E0309 13:39:29.425186 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-ggxj5" podUID="79abb676-0322-4a75-90f5-743c942073b4" Mar 09 13:39:29 crc kubenswrapper[4764]: E0309 13:39:29.452820 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 09 13:39:29 crc kubenswrapper[4764]: E0309 13:39:29.452994 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7dtxl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPoli
cy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-cwgfl_openstack(04d22384-e765-4ac8-9afa-7a31f4c347b2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:39:29 crc kubenswrapper[4764]: E0309 13:39:29.454276 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl" podUID="04d22384-e765-4ac8-9afa-7a31f4c347b2" Mar 09 13:39:29 crc kubenswrapper[4764]: E0309 13:39:29.949160 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl" podUID="04d22384-e765-4ac8-9afa-7a31f4c347b2" Mar 09 13:39:29 crc kubenswrapper[4764]: E0309 13:39:29.949607 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" podUID="85351658-0136-4066-b39e-808260c4dae9" Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.042437 4764 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.262964 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.276685 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.297785 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.337404 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.423479 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.470516 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qm7vs"] Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.471546 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ggxj5" Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.485661 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.487736 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-fcc4m" Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.517275 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-2zkzm"] Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.587955 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-config\") pod \"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd\" (UID: \"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd\") " Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.588523 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cbdt\" (UniqueName: \"kubernetes.io/projected/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-kube-api-access-8cbdt\") pod \"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd\" (UID: \"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd\") " Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.588583 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-config" (OuterVolumeSpecName: "config") pod "6bf8d6e3-823d-4f28-b5dd-e4df591df8bd" (UID: "6bf8d6e3-823d-4f28-b5dd-e4df591df8bd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.589812 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-dns-svc\") pod \"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd\" (UID: \"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd\") " Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.589909 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79abb676-0322-4a75-90f5-743c942073b4-config\") pod \"79abb676-0322-4a75-90f5-743c942073b4\" (UID: \"79abb676-0322-4a75-90f5-743c942073b4\") " Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.589949 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzz2f\" (UniqueName: \"kubernetes.io/projected/79abb676-0322-4a75-90f5-743c942073b4-kube-api-access-wzz2f\") pod \"79abb676-0322-4a75-90f5-743c942073b4\" (UID: \"79abb676-0322-4a75-90f5-743c942073b4\") " Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.590356 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6bf8d6e3-823d-4f28-b5dd-e4df591df8bd" (UID: "6bf8d6e3-823d-4f28-b5dd-e4df591df8bd"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.590685 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.590705 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.590619 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79abb676-0322-4a75-90f5-743c942073b4-config" (OuterVolumeSpecName: "config") pod "79abb676-0322-4a75-90f5-743c942073b4" (UID: "79abb676-0322-4a75-90f5-743c942073b4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.595360 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79abb676-0322-4a75-90f5-743c942073b4-kube-api-access-wzz2f" (OuterVolumeSpecName: "kube-api-access-wzz2f") pod "79abb676-0322-4a75-90f5-743c942073b4" (UID: "79abb676-0322-4a75-90f5-743c942073b4"). InnerVolumeSpecName "kube-api-access-wzz2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.595413 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-kube-api-access-8cbdt" (OuterVolumeSpecName: "kube-api-access-8cbdt") pod "6bf8d6e3-823d-4f28-b5dd-e4df591df8bd" (UID: "6bf8d6e3-823d-4f28-b5dd-e4df591df8bd"). InnerVolumeSpecName "kube-api-access-8cbdt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.692456 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cbdt\" (UniqueName: \"kubernetes.io/projected/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-kube-api-access-8cbdt\") on node \"crc\" DevicePath \"\"" Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.692498 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79abb676-0322-4a75-90f5-743c942073b4-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.692514 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzz2f\" (UniqueName: \"kubernetes.io/projected/79abb676-0322-4a75-90f5-743c942073b4-kube-api-access-wzz2f\") on node \"crc\" DevicePath \"\"" Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.957616 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"519ac270-ea24-47c1-b4f3-d94b0add96d1","Type":"ContainerStarted","Data":"8b96065c71ce956d7ca9e3f9b0dc1039ec60f27e5692b9eabf4e01ffb71c54e6"} Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.962451 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-ggxj5" event={"ID":"79abb676-0322-4a75-90f5-743c942073b4","Type":"ContainerDied","Data":"689139240ebd2f6bf1f73c798b5e7965a7b610107fbcfeb6b865f8c4180b122b"} Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.962463 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ggxj5" Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.965189 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"103cd40b-aa84-4973-8e47-8a67e5994c80","Type":"ContainerStarted","Data":"d4896fdeaab9e0ab6795bf0feabe6e710312aa60d40358c1313f6f61fd1384b8"} Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.968134 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"11c8bf9f-a031-4e56-b1d7-49b407eabaf7","Type":"ContainerStarted","Data":"b97c921b54e1f12956d845171f6d90fe64a80d32c024a23960cca4b47667dc15"} Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.970797 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-fcc4m" Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.970797 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-fcc4m" event={"ID":"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd","Type":"ContainerDied","Data":"790e722a1a82b199681b86b0375d2d82ee82392b6ce4b20994ef141a06eb2c50"} Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.973432 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ce660994-4427-4d54-b83c-9c9ec7f64a9d","Type":"ContainerStarted","Data":"b8a8570b31f5838a5c44b32bc16eef1004cf79cfc6a6ee8f31255abe6b100221"} Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.983118 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"047aa387-9e35-4ec6-89a9-3be60e47610b","Type":"ContainerStarted","Data":"73534d7403f4d4b8a3d758d05ae643c12076d7d324d41d9ae662fa58e1d331bb"} Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.985608 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2zkzm" 
event={"ID":"05f9485e-b683-481d-87d3-fb86ebb4a832","Type":"ContainerStarted","Data":"0c9065ceb2c5b59137c3a5f22de500855f80722abaf07ce360f91a72b093f0aa"} Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.986672 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0c87ed75-4285-4084-bce3-ee8dba7671c0","Type":"ContainerStarted","Data":"cd15c32a15b75663d44df8d62c33aca309dbb77a08f8075368a95588ab9bf712"} Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.991613 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"507bcef1-e9ef-4eb1-85ce-358209b944bc","Type":"ContainerStarted","Data":"ee9809e2cf751402688e9f6828a75759ba83ac17c29d13b65aa1aa2a2afdc207"} Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.992894 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qm7vs" event={"ID":"9bbe03cf-76d5-440a-903f-50c382aa3a4e","Type":"ContainerStarted","Data":"5868a99ec8e1266049239845f3b47de2af15a5d3b1e44885fe764fb14f411078"} Mar 09 13:39:31 crc kubenswrapper[4764]: I0309 13:39:31.037120 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ggxj5"] Mar 09 13:39:31 crc kubenswrapper[4764]: I0309 13:39:31.054095 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ggxj5"] Mar 09 13:39:31 crc kubenswrapper[4764]: I0309 13:39:31.079041 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fcc4m"] Mar 09 13:39:31 crc kubenswrapper[4764]: I0309 13:39:31.086148 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fcc4m"] Mar 09 13:39:31 crc kubenswrapper[4764]: I0309 13:39:31.585489 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bf8d6e3-823d-4f28-b5dd-e4df591df8bd" path="/var/lib/kubelet/pods/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd/volumes" Mar 09 13:39:31 crc 
kubenswrapper[4764]: I0309 13:39:31.586317 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79abb676-0322-4a75-90f5-743c942073b4" path="/var/lib/kubelet/pods/79abb676-0322-4a75-90f5-743c942073b4/volumes" Mar 09 13:39:31 crc kubenswrapper[4764]: I0309 13:39:31.605353 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 09 13:39:33 crc kubenswrapper[4764]: W0309 13:39:33.812148 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode54bd06b_1ee2_452d_80fb_12fd4fb61c7b.slice/crio-587d898cabcd704974d0b76f839425d05f6d9c4e36b24c70baeee270aea0c8dc WatchSource:0}: Error finding container 587d898cabcd704974d0b76f839425d05f6d9c4e36b24c70baeee270aea0c8dc: Status 404 returned error can't find the container with id 587d898cabcd704974d0b76f839425d05f6d9c4e36b24c70baeee270aea0c8dc Mar 09 13:39:34 crc kubenswrapper[4764]: I0309 13:39:34.018945 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b","Type":"ContainerStarted","Data":"587d898cabcd704974d0b76f839425d05f6d9c4e36b24c70baeee270aea0c8dc"} Mar 09 13:39:39 crc kubenswrapper[4764]: I0309 13:39:39.075741 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"519ac270-ea24-47c1-b4f3-d94b0add96d1","Type":"ContainerStarted","Data":"857b2ed5ce566d3011852236c58b722132628d1f16a978cb03c1aa03305707b9"} Mar 09 13:39:39 crc kubenswrapper[4764]: I0309 13:39:39.076488 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 09 13:39:39 crc kubenswrapper[4764]: I0309 13:39:39.077206 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0c87ed75-4285-4084-bce3-ee8dba7671c0","Type":"ContainerStarted","Data":"ee6313eb1e523335aac7fc5b10eeb2f3e2a30c9ec3b48d04c221cffc105acf19"} Mar 09 
13:39:39 crc kubenswrapper[4764]: I0309 13:39:39.117440 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=14.756977368 podStartE2EDuration="22.117418617s" podCreationTimestamp="2026-03-09 13:39:17 +0000 UTC" firstStartedPulling="2026-03-09 13:39:30.286180492 +0000 UTC m=+1125.536352400" lastFinishedPulling="2026-03-09 13:39:37.646621741 +0000 UTC m=+1132.896793649" observedRunningTime="2026-03-09 13:39:39.100064455 +0000 UTC m=+1134.350236363" watchObservedRunningTime="2026-03-09 13:39:39.117418617 +0000 UTC m=+1134.367590535" Mar 09 13:39:40 crc kubenswrapper[4764]: I0309 13:39:40.095585 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qm7vs" event={"ID":"9bbe03cf-76d5-440a-903f-50c382aa3a4e","Type":"ContainerStarted","Data":"7780fec5f70c7a0c4ccec4952ccb60908acc7914c6f1cd488d5262ad3acf8061"} Mar 09 13:39:40 crc kubenswrapper[4764]: I0309 13:39:40.096203 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:40 crc kubenswrapper[4764]: I0309 13:39:40.099826 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ce660994-4427-4d54-b83c-9c9ec7f64a9d","Type":"ContainerStarted","Data":"9847def4972a082de5de5ce66af1aea1d5c2f1147d6cb0e73ffadb728f7879a5"} Mar 09 13:39:40 crc kubenswrapper[4764]: I0309 13:39:40.099978 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 09 13:39:40 crc kubenswrapper[4764]: I0309 13:39:40.104607 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"103cd40b-aa84-4973-8e47-8a67e5994c80","Type":"ContainerStarted","Data":"de2693cfb592190f100e10043f32ff98b97a17851959b75e973a39d89f3be6c8"} Mar 09 13:39:40 crc kubenswrapper[4764]: I0309 13:39:40.108631 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovsdbserver-sb-0" event={"ID":"047aa387-9e35-4ec6-89a9-3be60e47610b","Type":"ContainerStarted","Data":"b40ecda29016254d1a36a05d5e6839eebe91e78d5ca273840367053a0ddf1f53"} Mar 09 13:39:40 crc kubenswrapper[4764]: I0309 13:39:40.113712 4764 generic.go:334] "Generic (PLEG): container finished" podID="05f9485e-b683-481d-87d3-fb86ebb4a832" containerID="2b0a7ff734ca1505391a107d091f7de45231761d7a49f25ccee267462bd84773" exitCode=0 Mar 09 13:39:40 crc kubenswrapper[4764]: I0309 13:39:40.115091 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2zkzm" event={"ID":"05f9485e-b683-481d-87d3-fb86ebb4a832","Type":"ContainerDied","Data":"2b0a7ff734ca1505391a107d091f7de45231761d7a49f25ccee267462bd84773"} Mar 09 13:39:40 crc kubenswrapper[4764]: I0309 13:39:40.117994 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b","Type":"ContainerStarted","Data":"e3787b08ede6c8e8ef08219c9e0ba6cf1a7c8bd5c292aead18b3f917f6e8d996"} Mar 09 13:39:40 crc kubenswrapper[4764]: I0309 13:39:40.118698 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-qm7vs" podStartSLOduration=9.481128873 podStartE2EDuration="17.118676299s" podCreationTimestamp="2026-03-09 13:39:23 +0000 UTC" firstStartedPulling="2026-03-09 13:39:30.491988179 +0000 UTC m=+1125.742160087" lastFinishedPulling="2026-03-09 13:39:38.129535605 +0000 UTC m=+1133.379707513" observedRunningTime="2026-03-09 13:39:40.116353798 +0000 UTC m=+1135.366525706" watchObservedRunningTime="2026-03-09 13:39:40.118676299 +0000 UTC m=+1135.368848207" Mar 09 13:39:40 crc kubenswrapper[4764]: I0309 13:39:40.190598 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=13.088438162 podStartE2EDuration="21.190574274s" podCreationTimestamp="2026-03-09 13:39:19 +0000 UTC" firstStartedPulling="2026-03-09 
13:39:30.510095301 +0000 UTC m=+1125.760267209" lastFinishedPulling="2026-03-09 13:39:38.612231413 +0000 UTC m=+1133.862403321" observedRunningTime="2026-03-09 13:39:40.183108199 +0000 UTC m=+1135.433280117" watchObservedRunningTime="2026-03-09 13:39:40.190574274 +0000 UTC m=+1135.440746182" Mar 09 13:39:41 crc kubenswrapper[4764]: I0309 13:39:41.131851 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"507bcef1-e9ef-4eb1-85ce-358209b944bc","Type":"ContainerStarted","Data":"f2c108c98ecfd46a140cf2428c8bd7a9e4bb88eab84b2da0f3783d087eb2e601"} Mar 09 13:39:41 crc kubenswrapper[4764]: I0309 13:39:41.143302 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"11c8bf9f-a031-4e56-b1d7-49b407eabaf7","Type":"ContainerStarted","Data":"fd0a79b758702c401b2bbc87884a5f0ad37053c07ad4da0c2c69610d9e0509c2"} Mar 09 13:39:41 crc kubenswrapper[4764]: I0309 13:39:41.146758 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2zkzm" event={"ID":"05f9485e-b683-481d-87d3-fb86ebb4a832","Type":"ContainerStarted","Data":"6d06d27d33c7238bf1afccbff4afe123a96246645b51d5bfa4856ce504137597"} Mar 09 13:39:41 crc kubenswrapper[4764]: I0309 13:39:41.146787 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2zkzm" event={"ID":"05f9485e-b683-481d-87d3-fb86ebb4a832","Type":"ContainerStarted","Data":"2c3a94d19625a148abe11e06ba10b6b5dc33c7a969c78936633b8e6bacf89e8a"} Mar 09 13:39:41 crc kubenswrapper[4764]: I0309 13:39:41.222245 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-2zkzm" podStartSLOduration=11.103769899 podStartE2EDuration="18.222216267s" podCreationTimestamp="2026-03-09 13:39:23 +0000 UTC" firstStartedPulling="2026-03-09 13:39:30.530427762 +0000 UTC m=+1125.780599670" lastFinishedPulling="2026-03-09 13:39:37.64887413 +0000 UTC m=+1132.899046038" 
observedRunningTime="2026-03-09 13:39:41.218752546 +0000 UTC m=+1136.468924464" watchObservedRunningTime="2026-03-09 13:39:41.222216267 +0000 UTC m=+1136.472388195" Mar 09 13:39:42 crc kubenswrapper[4764]: I0309 13:39:42.160733 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:42 crc kubenswrapper[4764]: I0309 13:39:42.161092 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:44 crc kubenswrapper[4764]: I0309 13:39:44.184992 4764 generic.go:334] "Generic (PLEG): container finished" podID="103cd40b-aa84-4973-8e47-8a67e5994c80" containerID="de2693cfb592190f100e10043f32ff98b97a17851959b75e973a39d89f3be6c8" exitCode=0 Mar 09 13:39:44 crc kubenswrapper[4764]: I0309 13:39:44.185096 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"103cd40b-aa84-4973-8e47-8a67e5994c80","Type":"ContainerDied","Data":"de2693cfb592190f100e10043f32ff98b97a17851959b75e973a39d89f3be6c8"} Mar 09 13:39:44 crc kubenswrapper[4764]: I0309 13:39:44.187694 4764 generic.go:334] "Generic (PLEG): container finished" podID="0c87ed75-4285-4084-bce3-ee8dba7671c0" containerID="ee6313eb1e523335aac7fc5b10eeb2f3e2a30c9ec3b48d04c221cffc105acf19" exitCode=0 Mar 09 13:39:44 crc kubenswrapper[4764]: I0309 13:39:44.187727 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0c87ed75-4285-4084-bce3-ee8dba7671c0","Type":"ContainerDied","Data":"ee6313eb1e523335aac7fc5b10eeb2f3e2a30c9ec3b48d04c221cffc105acf19"} Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.693634 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-8ctgr"] Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.695423 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-8ctgr" Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.697826 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.708462 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-8ctgr"] Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.832352 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4db14a6b-d372-48be-86a1-bf651618b4a4-combined-ca-bundle\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr" Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.832426 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/4db14a6b-d372-48be-86a1-bf651618b4a4-ovs-rundir\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr" Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.832685 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4db14a6b-d372-48be-86a1-bf651618b4a4-config\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr" Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.832948 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4db14a6b-d372-48be-86a1-bf651618b4a4-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " 
pod="openstack/ovn-controller-metrics-8ctgr" Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.833159 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/4db14a6b-d372-48be-86a1-bf651618b4a4-ovn-rundir\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr" Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.833428 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgb6g\" (UniqueName: \"kubernetes.io/projected/4db14a6b-d372-48be-86a1-bf651618b4a4-kube-api-access-vgb6g\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr" Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.887150 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nc2v8"] Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.929276 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-fmxtl"] Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.931327 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.935741 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4db14a6b-d372-48be-86a1-bf651618b4a4-combined-ca-bundle\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr" Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.935811 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/4db14a6b-d372-48be-86a1-bf651618b4a4-ovs-rundir\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr" Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.935849 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4db14a6b-d372-48be-86a1-bf651618b4a4-config\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr" Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.935881 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4db14a6b-d372-48be-86a1-bf651618b4a4-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr" Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.935917 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/4db14a6b-d372-48be-86a1-bf651618b4a4-ovn-rundir\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " 
pod="openstack/ovn-controller-metrics-8ctgr" Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.935972 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgb6g\" (UniqueName: \"kubernetes.io/projected/4db14a6b-d372-48be-86a1-bf651618b4a4-kube-api-access-vgb6g\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr" Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.941730 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.942570 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/4db14a6b-d372-48be-86a1-bf651618b4a4-ovs-rundir\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr" Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.943008 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4db14a6b-d372-48be-86a1-bf651618b4a4-config\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr" Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.943135 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/4db14a6b-d372-48be-86a1-bf651618b4a4-ovn-rundir\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr" Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.945900 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4db14a6b-d372-48be-86a1-bf651618b4a4-combined-ca-bundle\") pod 
\"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr" Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.946593 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-fmxtl"] Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.948403 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4db14a6b-d372-48be-86a1-bf651618b4a4-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr" Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.968337 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgb6g\" (UniqueName: \"kubernetes.io/projected/4db14a6b-d372-48be-86a1-bf651618b4a4-kube-api-access-vgb6g\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.014363 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-8ctgr" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.038370 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-fmxtl\" (UID: \"2bb93a9a-6443-4352-b7ae-64f953af06c3\") " pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.038467 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-fmxtl\" (UID: \"2bb93a9a-6443-4352-b7ae-64f953af06c3\") " pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.038626 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-config\") pod \"dnsmasq-dns-5bf47b49b7-fmxtl\" (UID: \"2bb93a9a-6443-4352-b7ae-64f953af06c3\") " pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.038812 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl6fl\" (UniqueName: \"kubernetes.io/projected/2bb93a9a-6443-4352-b7ae-64f953af06c3-kube-api-access-kl6fl\") pod \"dnsmasq-dns-5bf47b49b7-fmxtl\" (UID: \"2bb93a9a-6443-4352-b7ae-64f953af06c3\") " pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.086584 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-cwgfl"] Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.114195 4764 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-8554648995-p289h"] Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.115662 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.117967 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.128622 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-p289h"] Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.141229 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-config\") pod \"dnsmasq-dns-5bf47b49b7-fmxtl\" (UID: \"2bb93a9a-6443-4352-b7ae-64f953af06c3\") " pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.141356 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl6fl\" (UniqueName: \"kubernetes.io/projected/2bb93a9a-6443-4352-b7ae-64f953af06c3-kube-api-access-kl6fl\") pod \"dnsmasq-dns-5bf47b49b7-fmxtl\" (UID: \"2bb93a9a-6443-4352-b7ae-64f953af06c3\") " pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.141406 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-fmxtl\" (UID: \"2bb93a9a-6443-4352-b7ae-64f953af06c3\") " pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.141433 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-ovsdbserver-nb\") pod 
\"dnsmasq-dns-5bf47b49b7-fmxtl\" (UID: \"2bb93a9a-6443-4352-b7ae-64f953af06c3\") " pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.142247 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-config\") pod \"dnsmasq-dns-5bf47b49b7-fmxtl\" (UID: \"2bb93a9a-6443-4352-b7ae-64f953af06c3\") " pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.142358 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-fmxtl\" (UID: \"2bb93a9a-6443-4352-b7ae-64f953af06c3\") " pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.142687 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-fmxtl\" (UID: \"2bb93a9a-6443-4352-b7ae-64f953af06c3\") " pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.162628 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl6fl\" (UniqueName: \"kubernetes.io/projected/2bb93a9a-6443-4352-b7ae-64f953af06c3-kube-api-access-kl6fl\") pod \"dnsmasq-dns-5bf47b49b7-fmxtl\" (UID: \"2bb93a9a-6443-4352-b7ae-64f953af06c3\") " pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.243451 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-p289h\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " 
pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.243552 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-dns-svc\") pod \"dnsmasq-dns-8554648995-p289h\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.243577 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-config\") pod \"dnsmasq-dns-8554648995-p289h\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.243662 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-p289h\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.243700 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkvzb\" (UniqueName: \"kubernetes.io/projected/e921061b-2a0f-4b22-beb1-0d52993dc06b-kube-api-access-xkvzb\") pod \"dnsmasq-dns-8554648995-p289h\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.325873 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.345482 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-p289h\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.345568 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-dns-svc\") pod \"dnsmasq-dns-8554648995-p289h\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.345590 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-config\") pod \"dnsmasq-dns-8554648995-p289h\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.345635 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-p289h\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.345680 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkvzb\" (UniqueName: \"kubernetes.io/projected/e921061b-2a0f-4b22-beb1-0d52993dc06b-kube-api-access-xkvzb\") pod \"dnsmasq-dns-8554648995-p289h\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " pod="openstack/dnsmasq-dns-8554648995-p289h" 
Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.346589 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-p289h\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.346792 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-p289h\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.346792 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-dns-svc\") pod \"dnsmasq-dns-8554648995-p289h\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.346897 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-config\") pod \"dnsmasq-dns-8554648995-p289h\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.364846 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkvzb\" (UniqueName: \"kubernetes.io/projected/e921061b-2a0f-4b22-beb1-0d52993dc06b-kube-api-access-xkvzb\") pod \"dnsmasq-dns-8554648995-p289h\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.434296 4764 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.753089 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 09 13:39:48 crc kubenswrapper[4764]: I0309 13:39:48.861316 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-fmxtl"] Mar 09 13:39:48 crc kubenswrapper[4764]: W0309 13:39:48.908365 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bb93a9a_6443_4352_b7ae_64f953af06c3.slice/crio-28b675926fb711438a050500ec110912822d9d8bb336d51faa38b7f7dfaac442 WatchSource:0}: Error finding container 28b675926fb711438a050500ec110912822d9d8bb336d51faa38b7f7dfaac442: Status 404 returned error can't find the container with id 28b675926fb711438a050500ec110912822d9d8bb336d51faa38b7f7dfaac442 Mar 09 13:39:48 crc kubenswrapper[4764]: I0309 13:39:48.944794 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-p289h"] Mar 09 13:39:48 crc kubenswrapper[4764]: I0309 13:39:48.952018 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-8ctgr"] Mar 09 13:39:48 crc kubenswrapper[4764]: W0309 13:39:48.967795 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4db14a6b_d372_48be_86a1_bf651618b4a4.slice/crio-5f1f3b695c9092d35c13f63640cf706db0ae1c1f6b92f018ab38db40ef8ce375 WatchSource:0}: Error finding container 5f1f3b695c9092d35c13f63640cf706db0ae1c1f6b92f018ab38db40ef8ce375: Status 404 returned error can't find the container with id 5f1f3b695c9092d35c13f63640cf706db0ae1c1f6b92f018ab38db40ef8ce375 Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.236876 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"103cd40b-aa84-4973-8e47-8a67e5994c80","Type":"ContainerStarted","Data":"410872c79b144ea284f97f2a3d87683399483aba478ba3a14219e2f185e1ae16"} Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.248302 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"047aa387-9e35-4ec6-89a9-3be60e47610b","Type":"ContainerStarted","Data":"e73ea01d875b730903dcad2e132535b2d49a086bb92f6db3352cfa5ee5ca2450"} Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.250696 4764 generic.go:334] "Generic (PLEG): container finished" podID="85351658-0136-4066-b39e-808260c4dae9" containerID="ec052281dc06284d14011281ec9b60b7e02248f374cf9034b717280c98b8f271" exitCode=0 Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.250749 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" event={"ID":"85351658-0136-4066-b39e-808260c4dae9","Type":"ContainerDied","Data":"ec052281dc06284d14011281ec9b60b7e02248f374cf9034b717280c98b8f271"} Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.258616 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b","Type":"ContainerStarted","Data":"735fc93c908deb58f7b6ff73db8f24473e71aafae363e9eb4e59d8cba57c104f"} Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.260567 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-p289h" event={"ID":"e921061b-2a0f-4b22-beb1-0d52993dc06b","Type":"ContainerStarted","Data":"9c01a77a060dbab4de5d1ba1f06fcd3807020da1983c9df162b0099cb08b09d0"} Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.261710 4764 generic.go:334] "Generic (PLEG): container finished" podID="2bb93a9a-6443-4352-b7ae-64f953af06c3" containerID="b6ee8ca17d3dd32257565eee931384458b6f3e51cb8dbc64e3792bc8375f65d4" exitCode=0 Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.261755 4764 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" event={"ID":"2bb93a9a-6443-4352-b7ae-64f953af06c3","Type":"ContainerDied","Data":"b6ee8ca17d3dd32257565eee931384458b6f3e51cb8dbc64e3792bc8375f65d4"} Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.261772 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" event={"ID":"2bb93a9a-6443-4352-b7ae-64f953af06c3","Type":"ContainerStarted","Data":"28b675926fb711438a050500ec110912822d9d8bb336d51faa38b7f7dfaac442"} Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.266503 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0c87ed75-4285-4084-bce3-ee8dba7671c0","Type":"ContainerStarted","Data":"13a98e60c52c501f0ea3d76503821122c0bc72169ccfcb9e1c871be7d09f9f96"} Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.275351 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=26.419149893 podStartE2EDuration="34.27533139s" podCreationTimestamp="2026-03-09 13:39:15 +0000 UTC" firstStartedPulling="2026-03-09 13:39:30.275063922 +0000 UTC m=+1125.525235830" lastFinishedPulling="2026-03-09 13:39:38.131245399 +0000 UTC m=+1133.381417327" observedRunningTime="2026-03-09 13:39:49.26615364 +0000 UTC m=+1144.516325548" watchObservedRunningTime="2026-03-09 13:39:49.27533139 +0000 UTC m=+1144.525503298" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.281565 4764 generic.go:334] "Generic (PLEG): container finished" podID="04d22384-e765-4ac8-9afa-7a31f4c347b2" containerID="12c1d1328197b6dafdcdd8647a30e40939b8ce0dfd16c52057876ad12cecfc4a" exitCode=0 Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.281623 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl" 
event={"ID":"04d22384-e765-4ac8-9afa-7a31f4c347b2","Type":"ContainerDied","Data":"12c1d1328197b6dafdcdd8647a30e40939b8ce0dfd16c52057876ad12cecfc4a"} Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.284478 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8ctgr" event={"ID":"4db14a6b-d372-48be-86a1-bf651618b4a4","Type":"ContainerStarted","Data":"5f1f3b695c9092d35c13f63640cf706db0ae1c1f6b92f018ab38db40ef8ce375"} Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.331390 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=6.358309222 podStartE2EDuration="24.331371651s" podCreationTimestamp="2026-03-09 13:39:25 +0000 UTC" firstStartedPulling="2026-03-09 13:39:30.42529054 +0000 UTC m=+1125.675462448" lastFinishedPulling="2026-03-09 13:39:48.398352979 +0000 UTC m=+1143.648524877" observedRunningTime="2026-03-09 13:39:49.314581033 +0000 UTC m=+1144.564752951" watchObservedRunningTime="2026-03-09 13:39:49.331371651 +0000 UTC m=+1144.581543559" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.350995 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=11.757765874 podStartE2EDuration="26.350971512s" podCreationTimestamp="2026-03-09 13:39:23 +0000 UTC" firstStartedPulling="2026-03-09 13:39:33.830109592 +0000 UTC m=+1129.080281500" lastFinishedPulling="2026-03-09 13:39:48.42331524 +0000 UTC m=+1143.673487138" observedRunningTime="2026-03-09 13:39:49.344249487 +0000 UTC m=+1144.594421395" watchObservedRunningTime="2026-03-09 13:39:49.350971512 +0000 UTC m=+1144.601143420" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.423492 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=27.881124361 podStartE2EDuration="35.423460053s" podCreationTimestamp="2026-03-09 13:39:14 +0000 UTC" 
firstStartedPulling="2026-03-09 13:39:30.272748322 +0000 UTC m=+1125.522920230" lastFinishedPulling="2026-03-09 13:39:37.815084014 +0000 UTC m=+1133.065255922" observedRunningTime="2026-03-09 13:39:49.41759339 +0000 UTC m=+1144.667765308" watchObservedRunningTime="2026-03-09 13:39:49.423460053 +0000 UTC m=+1144.673631961" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.471321 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.734525 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.739465 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.897365 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85351658-0136-4066-b39e-808260c4dae9-config\") pod \"85351658-0136-4066-b39e-808260c4dae9\" (UID: \"85351658-0136-4066-b39e-808260c4dae9\") " Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.897607 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltsrn\" (UniqueName: \"kubernetes.io/projected/85351658-0136-4066-b39e-808260c4dae9-kube-api-access-ltsrn\") pod \"85351658-0136-4066-b39e-808260c4dae9\" (UID: \"85351658-0136-4066-b39e-808260c4dae9\") " Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.897638 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85351658-0136-4066-b39e-808260c4dae9-dns-svc\") pod \"85351658-0136-4066-b39e-808260c4dae9\" (UID: \"85351658-0136-4066-b39e-808260c4dae9\") " Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.897691 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04d22384-e765-4ac8-9afa-7a31f4c347b2-dns-svc\") pod \"04d22384-e765-4ac8-9afa-7a31f4c347b2\" (UID: \"04d22384-e765-4ac8-9afa-7a31f4c347b2\") " Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.897740 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04d22384-e765-4ac8-9afa-7a31f4c347b2-config\") pod \"04d22384-e765-4ac8-9afa-7a31f4c347b2\" (UID: \"04d22384-e765-4ac8-9afa-7a31f4c347b2\") " Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.897770 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dtxl\" (UniqueName: \"kubernetes.io/projected/04d22384-e765-4ac8-9afa-7a31f4c347b2-kube-api-access-7dtxl\") pod \"04d22384-e765-4ac8-9afa-7a31f4c347b2\" (UID: \"04d22384-e765-4ac8-9afa-7a31f4c347b2\") " Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.902962 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85351658-0136-4066-b39e-808260c4dae9-kube-api-access-ltsrn" (OuterVolumeSpecName: "kube-api-access-ltsrn") pod "85351658-0136-4066-b39e-808260c4dae9" (UID: "85351658-0136-4066-b39e-808260c4dae9"). InnerVolumeSpecName "kube-api-access-ltsrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.903940 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04d22384-e765-4ac8-9afa-7a31f4c347b2-kube-api-access-7dtxl" (OuterVolumeSpecName: "kube-api-access-7dtxl") pod "04d22384-e765-4ac8-9afa-7a31f4c347b2" (UID: "04d22384-e765-4ac8-9afa-7a31f4c347b2"). InnerVolumeSpecName "kube-api-access-7dtxl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.919610 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04d22384-e765-4ac8-9afa-7a31f4c347b2-config" (OuterVolumeSpecName: "config") pod "04d22384-e765-4ac8-9afa-7a31f4c347b2" (UID: "04d22384-e765-4ac8-9afa-7a31f4c347b2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.920832 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85351658-0136-4066-b39e-808260c4dae9-config" (OuterVolumeSpecName: "config") pod "85351658-0136-4066-b39e-808260c4dae9" (UID: "85351658-0136-4066-b39e-808260c4dae9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.922048 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85351658-0136-4066-b39e-808260c4dae9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "85351658-0136-4066-b39e-808260c4dae9" (UID: "85351658-0136-4066-b39e-808260c4dae9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.925134 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04d22384-e765-4ac8-9afa-7a31f4c347b2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "04d22384-e765-4ac8-9afa-7a31f4c347b2" (UID: "04d22384-e765-4ac8-9afa-7a31f4c347b2"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.999746 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltsrn\" (UniqueName: \"kubernetes.io/projected/85351658-0136-4066-b39e-808260c4dae9-kube-api-access-ltsrn\") on node \"crc\" DevicePath \"\"" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.999791 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85351658-0136-4066-b39e-808260c4dae9-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.999802 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04d22384-e765-4ac8-9afa-7a31f4c347b2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.999811 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04d22384-e765-4ac8-9afa-7a31f4c347b2-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.999821 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dtxl\" (UniqueName: \"kubernetes.io/projected/04d22384-e765-4ac8-9afa-7a31f4c347b2-kube-api-access-7dtxl\") on node \"crc\" DevicePath \"\"" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.999830 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85351658-0136-4066-b39e-808260c4dae9-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.087027 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.295990 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" 
event={"ID":"2bb93a9a-6443-4352-b7ae-64f953af06c3","Type":"ContainerStarted","Data":"c1e00e2d332968b73536bcf21385c897383efbfb11adcc7b64e82bf17537d564"} Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.296460 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.301779 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.301776 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" event={"ID":"85351658-0136-4066-b39e-808260c4dae9","Type":"ContainerDied","Data":"cf38f50edd1cd803b2485a43d9bb0cf84c48deb7190f61ca1abae1471ed84bf4"} Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.301854 4764 scope.go:117] "RemoveContainer" containerID="ec052281dc06284d14011281ec9b60b7e02248f374cf9034b717280c98b8f271" Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.304335 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl" event={"ID":"04d22384-e765-4ac8-9afa-7a31f4c347b2","Type":"ContainerDied","Data":"e9eba66ff4604aeb6e4bde7029daea515e55ea3fe70bfdcf16702550cc3774f9"} Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.304427 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl" Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.306116 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8ctgr" event={"ID":"4db14a6b-d372-48be-86a1-bf651618b4a4","Type":"ContainerStarted","Data":"4c99c68cd386dd5f2ee4e470f12775336204f36e86a09f448217410fe83d4556"} Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.309222 4764 generic.go:334] "Generic (PLEG): container finished" podID="e921061b-2a0f-4b22-beb1-0d52993dc06b" containerID="7c314d6ef6d0a466bb5e22ea5e084d1cbdc6ed4e3e9ab6aee110b11fa331f5fb" exitCode=0 Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.309413 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-p289h" event={"ID":"e921061b-2a0f-4b22-beb1-0d52993dc06b","Type":"ContainerDied","Data":"7c314d6ef6d0a466bb5e22ea5e084d1cbdc6ed4e3e9ab6aee110b11fa331f5fb"} Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.326029 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" podStartSLOduration=4.32600759 podStartE2EDuration="4.32600759s" podCreationTimestamp="2026-03-09 13:39:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:39:50.324709166 +0000 UTC m=+1145.574881084" watchObservedRunningTime="2026-03-09 13:39:50.32600759 +0000 UTC m=+1145.576179498" Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.341011 4764 scope.go:117] "RemoveContainer" containerID="12c1d1328197b6dafdcdd8647a30e40939b8ce0dfd16c52057876ad12cecfc4a" Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.492716 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-8ctgr" podStartSLOduration=4.492689267 podStartE2EDuration="4.492689267s" podCreationTimestamp="2026-03-09 13:39:46 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:39:50.380140912 +0000 UTC m=+1145.630312830" watchObservedRunningTime="2026-03-09 13:39:50.492689267 +0000 UTC m=+1145.742861175" Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.578036 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nc2v8"] Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.587280 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nc2v8"] Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.603720 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-cwgfl"] Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.606552 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-cwgfl"] Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.673632 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.715076 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:51 crc kubenswrapper[4764]: I0309 13:39:51.319801 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-p289h" event={"ID":"e921061b-2a0f-4b22-beb1-0d52993dc06b","Type":"ContainerStarted","Data":"fce1b5bc8a63a55a1345ac17564049805c448dc5638e52c8bc6cdc086dbbb27d"} Mar 09 13:39:51 crc kubenswrapper[4764]: I0309 13:39:51.319948 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:51 crc kubenswrapper[4764]: I0309 13:39:51.322706 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:51 crc kubenswrapper[4764]: I0309 13:39:51.379223 4764 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:51 crc kubenswrapper[4764]: I0309 13:39:51.416312 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-p289h" podStartSLOduration=4.416292992 podStartE2EDuration="4.416292992s" podCreationTimestamp="2026-03-09 13:39:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:39:51.344636383 +0000 UTC m=+1146.594808301" watchObservedRunningTime="2026-03-09 13:39:51.416292992 +0000 UTC m=+1146.666464900" Mar 09 13:39:51 crc kubenswrapper[4764]: I0309 13:39:51.471961 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:51 crc kubenswrapper[4764]: I0309 13:39:51.518472 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:51 crc kubenswrapper[4764]: I0309 13:39:51.569994 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04d22384-e765-4ac8-9afa-7a31f4c347b2" path="/var/lib/kubelet/pods/04d22384-e765-4ac8-9afa-7a31f4c347b2/volumes" Mar 09 13:39:51 crc kubenswrapper[4764]: I0309 13:39:51.570502 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85351658-0136-4066-b39e-808260c4dae9" path="/var/lib/kubelet/pods/85351658-0136-4066-b39e-808260c4dae9/volumes" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.484386 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.689021 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 09 13:39:52 crc kubenswrapper[4764]: E0309 13:39:52.689404 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="04d22384-e765-4ac8-9afa-7a31f4c347b2" containerName="init" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.689425 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="04d22384-e765-4ac8-9afa-7a31f4c347b2" containerName="init" Mar 09 13:39:52 crc kubenswrapper[4764]: E0309 13:39:52.689466 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85351658-0136-4066-b39e-808260c4dae9" containerName="init" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.689473 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="85351658-0136-4066-b39e-808260c4dae9" containerName="init" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.689634 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="85351658-0136-4066-b39e-808260c4dae9" containerName="init" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.691209 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="04d22384-e765-4ac8-9afa-7a31f4c347b2" containerName="init" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.692341 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.695047 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.695333 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.695535 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.696268 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-xvphx" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.720710 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.765893 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c76h5\" (UniqueName: \"kubernetes.io/projected/142a1ef0-f024-4a81-85de-72435cd72d9e-kube-api-access-c76h5\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.765968 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/142a1ef0-f024-4a81-85de-72435cd72d9e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.766033 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/142a1ef0-f024-4a81-85de-72435cd72d9e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " 
pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.766066 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/142a1ef0-f024-4a81-85de-72435cd72d9e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.766125 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/142a1ef0-f024-4a81-85de-72435cd72d9e-config\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.766299 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/142a1ef0-f024-4a81-85de-72435cd72d9e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.766371 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/142a1ef0-f024-4a81-85de-72435cd72d9e-scripts\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.868226 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/142a1ef0-f024-4a81-85de-72435cd72d9e-config\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.868297 4764 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/142a1ef0-f024-4a81-85de-72435cd72d9e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.868336 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/142a1ef0-f024-4a81-85de-72435cd72d9e-scripts\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.868396 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c76h5\" (UniqueName: \"kubernetes.io/projected/142a1ef0-f024-4a81-85de-72435cd72d9e-kube-api-access-c76h5\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.868442 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/142a1ef0-f024-4a81-85de-72435cd72d9e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.868498 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/142a1ef0-f024-4a81-85de-72435cd72d9e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.868533 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/142a1ef0-f024-4a81-85de-72435cd72d9e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") 
" pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.869086 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/142a1ef0-f024-4a81-85de-72435cd72d9e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.869510 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/142a1ef0-f024-4a81-85de-72435cd72d9e-config\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.869583 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/142a1ef0-f024-4a81-85de-72435cd72d9e-scripts\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.878451 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/142a1ef0-f024-4a81-85de-72435cd72d9e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.878812 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/142a1ef0-f024-4a81-85de-72435cd72d9e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.880482 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/142a1ef0-f024-4a81-85de-72435cd72d9e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.892336 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c76h5\" (UniqueName: \"kubernetes.io/projected/142a1ef0-f024-4a81-85de-72435cd72d9e-kube-api-access-c76h5\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:53 crc kubenswrapper[4764]: I0309 13:39:53.049799 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 09 13:39:53 crc kubenswrapper[4764]: I0309 13:39:53.505684 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 09 13:39:54 crc kubenswrapper[4764]: I0309 13:39:54.352981 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"142a1ef0-f024-4a81-85de-72435cd72d9e","Type":"ContainerStarted","Data":"ccf88e978a422fc942b0c5235260f37090e1f60b98b5997f974460cdc3f6c062"} Mar 09 13:39:55 crc kubenswrapper[4764]: I0309 13:39:55.363569 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"142a1ef0-f024-4a81-85de-72435cd72d9e","Type":"ContainerStarted","Data":"6f05851cc84a9a82ca7126142476a5945146d39221905b5e2c6ca997048ccdc9"} Mar 09 13:39:55 crc kubenswrapper[4764]: I0309 13:39:55.363941 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"142a1ef0-f024-4a81-85de-72435cd72d9e","Type":"ContainerStarted","Data":"f8b383ab68c5e5e270fd806d9d8701126761656254d8c76342087690668262ca"} Mar 09 13:39:55 crc kubenswrapper[4764]: I0309 13:39:55.364110 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 09 13:39:55 crc kubenswrapper[4764]: I0309 13:39:55.418299 4764 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.422010255 podStartE2EDuration="3.418273096s" podCreationTimestamp="2026-03-09 13:39:52 +0000 UTC" firstStartedPulling="2026-03-09 13:39:53.511973943 +0000 UTC m=+1148.762145851" lastFinishedPulling="2026-03-09 13:39:54.508236784 +0000 UTC m=+1149.758408692" observedRunningTime="2026-03-09 13:39:55.410296508 +0000 UTC m=+1150.660468416" watchObservedRunningTime="2026-03-09 13:39:55.418273096 +0000 UTC m=+1150.668445004" Mar 09 13:39:55 crc kubenswrapper[4764]: I0309 13:39:55.942221 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 09 13:39:55 crc kubenswrapper[4764]: I0309 13:39:55.942636 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 09 13:39:56 crc kubenswrapper[4764]: I0309 13:39:56.015281 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 09 13:39:56 crc kubenswrapper[4764]: I0309 13:39:56.449165 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 09 13:39:57 crc kubenswrapper[4764]: I0309 13:39:57.295868 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:57 crc kubenswrapper[4764]: I0309 13:39:57.296365 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:57 crc kubenswrapper[4764]: I0309 13:39:57.329575 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" Mar 09 13:39:57 crc kubenswrapper[4764]: I0309 13:39:57.427786 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:57 crc kubenswrapper[4764]: I0309 
13:39:57.440025 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:57 crc kubenswrapper[4764]: I0309 13:39:57.512024 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-fmxtl"] Mar 09 13:39:57 crc kubenswrapper[4764]: I0309 13:39:57.512243 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" podUID="2bb93a9a-6443-4352-b7ae-64f953af06c3" containerName="dnsmasq-dns" containerID="cri-o://c1e00e2d332968b73536bcf21385c897383efbfb11adcc7b64e82bf17537d564" gracePeriod=10 Mar 09 13:39:57 crc kubenswrapper[4764]: I0309 13:39:57.557420 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:57 crc kubenswrapper[4764]: I0309 13:39:57.995671 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-d7e9-account-create-update-n7gsb"] Mar 09 13:39:57 crc kubenswrapper[4764]: I0309 13:39:57.997447 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d7e9-account-create-update-n7gsb" Mar 09 13:39:57 crc kubenswrapper[4764]: I0309 13:39:57.999594 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:57.999992 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.019677 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d7e9-account-create-update-n7gsb"] Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.028833 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-wkxnp"] Mar 09 13:39:58 crc kubenswrapper[4764]: E0309 13:39:58.039874 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb93a9a-6443-4352-b7ae-64f953af06c3" containerName="dnsmasq-dns" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.039916 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb93a9a-6443-4352-b7ae-64f953af06c3" containerName="dnsmasq-dns" Mar 09 13:39:58 crc kubenswrapper[4764]: E0309 13:39:58.039970 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb93a9a-6443-4352-b7ae-64f953af06c3" containerName="init" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.039982 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb93a9a-6443-4352-b7ae-64f953af06c3" containerName="init" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.040363 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bb93a9a-6443-4352-b7ae-64f953af06c3" containerName="dnsmasq-dns" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.041097 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-wkxnp"] Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.041202 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-wkxnp" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.071596 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-ovsdbserver-nb\") pod \"2bb93a9a-6443-4352-b7ae-64f953af06c3\" (UID: \"2bb93a9a-6443-4352-b7ae-64f953af06c3\") " Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.071716 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-config\") pod \"2bb93a9a-6443-4352-b7ae-64f953af06c3\" (UID: \"2bb93a9a-6443-4352-b7ae-64f953af06c3\") " Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.071740 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-dns-svc\") pod \"2bb93a9a-6443-4352-b7ae-64f953af06c3\" (UID: \"2bb93a9a-6443-4352-b7ae-64f953af06c3\") " Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.071781 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kl6fl\" (UniqueName: \"kubernetes.io/projected/2bb93a9a-6443-4352-b7ae-64f953af06c3-kube-api-access-kl6fl\") pod \"2bb93a9a-6443-4352-b7ae-64f953af06c3\" (UID: \"2bb93a9a-6443-4352-b7ae-64f953af06c3\") " Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.072059 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdsck\" (UniqueName: \"kubernetes.io/projected/75f29150-3689-48a6-9248-b6774f85fcd2-kube-api-access-vdsck\") pod \"glance-db-create-wkxnp\" (UID: \"75f29150-3689-48a6-9248-b6774f85fcd2\") " pod="openstack/glance-db-create-wkxnp" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.072087 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/693ba99b-99d0-4b09-9f49-9deefe05abac-operator-scripts\") pod \"glance-d7e9-account-create-update-n7gsb\" (UID: \"693ba99b-99d0-4b09-9f49-9deefe05abac\") " pod="openstack/glance-d7e9-account-create-update-n7gsb" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.072150 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hszn\" (UniqueName: \"kubernetes.io/projected/693ba99b-99d0-4b09-9f49-9deefe05abac-kube-api-access-2hszn\") pod \"glance-d7e9-account-create-update-n7gsb\" (UID: \"693ba99b-99d0-4b09-9f49-9deefe05abac\") " pod="openstack/glance-d7e9-account-create-update-n7gsb" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.072222 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75f29150-3689-48a6-9248-b6774f85fcd2-operator-scripts\") pod \"glance-db-create-wkxnp\" (UID: \"75f29150-3689-48a6-9248-b6774f85fcd2\") " pod="openstack/glance-db-create-wkxnp" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.084938 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bb93a9a-6443-4352-b7ae-64f953af06c3-kube-api-access-kl6fl" (OuterVolumeSpecName: "kube-api-access-kl6fl") pod "2bb93a9a-6443-4352-b7ae-64f953af06c3" (UID: "2bb93a9a-6443-4352-b7ae-64f953af06c3"). InnerVolumeSpecName "kube-api-access-kl6fl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.133270 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2bb93a9a-6443-4352-b7ae-64f953af06c3" (UID: "2bb93a9a-6443-4352-b7ae-64f953af06c3"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.144240 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2bb93a9a-6443-4352-b7ae-64f953af06c3" (UID: "2bb93a9a-6443-4352-b7ae-64f953af06c3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.145511 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-config" (OuterVolumeSpecName: "config") pod "2bb93a9a-6443-4352-b7ae-64f953af06c3" (UID: "2bb93a9a-6443-4352-b7ae-64f953af06c3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.174258 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75f29150-3689-48a6-9248-b6774f85fcd2-operator-scripts\") pod \"glance-db-create-wkxnp\" (UID: \"75f29150-3689-48a6-9248-b6774f85fcd2\") " pod="openstack/glance-db-create-wkxnp" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.174364 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdsck\" (UniqueName: \"kubernetes.io/projected/75f29150-3689-48a6-9248-b6774f85fcd2-kube-api-access-vdsck\") pod \"glance-db-create-wkxnp\" (UID: \"75f29150-3689-48a6-9248-b6774f85fcd2\") " pod="openstack/glance-db-create-wkxnp" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.174388 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/693ba99b-99d0-4b09-9f49-9deefe05abac-operator-scripts\") pod 
\"glance-d7e9-account-create-update-n7gsb\" (UID: \"693ba99b-99d0-4b09-9f49-9deefe05abac\") " pod="openstack/glance-d7e9-account-create-update-n7gsb" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.174438 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hszn\" (UniqueName: \"kubernetes.io/projected/693ba99b-99d0-4b09-9f49-9deefe05abac-kube-api-access-2hszn\") pod \"glance-d7e9-account-create-update-n7gsb\" (UID: \"693ba99b-99d0-4b09-9f49-9deefe05abac\") " pod="openstack/glance-d7e9-account-create-update-n7gsb" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.174628 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.175077 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.175110 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.175123 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kl6fl\" (UniqueName: \"kubernetes.io/projected/2bb93a9a-6443-4352-b7ae-64f953af06c3-kube-api-access-kl6fl\") on node \"crc\" DevicePath \"\"" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.175397 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75f29150-3689-48a6-9248-b6774f85fcd2-operator-scripts\") pod \"glance-db-create-wkxnp\" (UID: \"75f29150-3689-48a6-9248-b6774f85fcd2\") " 
pod="openstack/glance-db-create-wkxnp" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.175514 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/693ba99b-99d0-4b09-9f49-9deefe05abac-operator-scripts\") pod \"glance-d7e9-account-create-update-n7gsb\" (UID: \"693ba99b-99d0-4b09-9f49-9deefe05abac\") " pod="openstack/glance-d7e9-account-create-update-n7gsb" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.192933 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hszn\" (UniqueName: \"kubernetes.io/projected/693ba99b-99d0-4b09-9f49-9deefe05abac-kube-api-access-2hszn\") pod \"glance-d7e9-account-create-update-n7gsb\" (UID: \"693ba99b-99d0-4b09-9f49-9deefe05abac\") " pod="openstack/glance-d7e9-account-create-update-n7gsb" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.194588 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdsck\" (UniqueName: \"kubernetes.io/projected/75f29150-3689-48a6-9248-b6774f85fcd2-kube-api-access-vdsck\") pod \"glance-db-create-wkxnp\" (UID: \"75f29150-3689-48a6-9248-b6774f85fcd2\") " pod="openstack/glance-db-create-wkxnp" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.324705 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d7e9-account-create-update-n7gsb" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.364924 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-wkxnp" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.370240 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.370285 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.390128 4764 generic.go:334] "Generic (PLEG): container finished" podID="2bb93a9a-6443-4352-b7ae-64f953af06c3" containerID="c1e00e2d332968b73536bcf21385c897383efbfb11adcc7b64e82bf17537d564" exitCode=0 Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.390240 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.390474 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" event={"ID":"2bb93a9a-6443-4352-b7ae-64f953af06c3","Type":"ContainerDied","Data":"c1e00e2d332968b73536bcf21385c897383efbfb11adcc7b64e82bf17537d564"} Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.390588 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" event={"ID":"2bb93a9a-6443-4352-b7ae-64f953af06c3","Type":"ContainerDied","Data":"28b675926fb711438a050500ec110912822d9d8bb336d51faa38b7f7dfaac442"} Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.390713 4764 scope.go:117] "RemoveContainer" containerID="c1e00e2d332968b73536bcf21385c897383efbfb11adcc7b64e82bf17537d564" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.433086 4764 scope.go:117] "RemoveContainer" containerID="b6ee8ca17d3dd32257565eee931384458b6f3e51cb8dbc64e3792bc8375f65d4" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.437292 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-fmxtl"] Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.447324 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-fmxtl"] Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.485944 4764 scope.go:117] "RemoveContainer" containerID="c1e00e2d332968b73536bcf21385c897383efbfb11adcc7b64e82bf17537d564" Mar 09 13:39:58 crc kubenswrapper[4764]: E0309 13:39:58.495333 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1e00e2d332968b73536bcf21385c897383efbfb11adcc7b64e82bf17537d564\": container with ID starting with c1e00e2d332968b73536bcf21385c897383efbfb11adcc7b64e82bf17537d564 not found: ID does not exist" 
containerID="c1e00e2d332968b73536bcf21385c897383efbfb11adcc7b64e82bf17537d564" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.495385 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1e00e2d332968b73536bcf21385c897383efbfb11adcc7b64e82bf17537d564"} err="failed to get container status \"c1e00e2d332968b73536bcf21385c897383efbfb11adcc7b64e82bf17537d564\": rpc error: code = NotFound desc = could not find container \"c1e00e2d332968b73536bcf21385c897383efbfb11adcc7b64e82bf17537d564\": container with ID starting with c1e00e2d332968b73536bcf21385c897383efbfb11adcc7b64e82bf17537d564 not found: ID does not exist" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.495420 4764 scope.go:117] "RemoveContainer" containerID="b6ee8ca17d3dd32257565eee931384458b6f3e51cb8dbc64e3792bc8375f65d4" Mar 09 13:39:58 crc kubenswrapper[4764]: E0309 13:39:58.498910 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6ee8ca17d3dd32257565eee931384458b6f3e51cb8dbc64e3792bc8375f65d4\": container with ID starting with b6ee8ca17d3dd32257565eee931384458b6f3e51cb8dbc64e3792bc8375f65d4 not found: ID does not exist" containerID="b6ee8ca17d3dd32257565eee931384458b6f3e51cb8dbc64e3792bc8375f65d4" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.498940 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6ee8ca17d3dd32257565eee931384458b6f3e51cb8dbc64e3792bc8375f65d4"} err="failed to get container status \"b6ee8ca17d3dd32257565eee931384458b6f3e51cb8dbc64e3792bc8375f65d4\": rpc error: code = NotFound desc = could not find container \"b6ee8ca17d3dd32257565eee931384458b6f3e51cb8dbc64e3792bc8375f65d4\": container with ID starting with b6ee8ca17d3dd32257565eee931384458b6f3e51cb8dbc64e3792bc8375f65d4 not found: ID does not exist" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.764420 4764 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-66ln9"] Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.774886 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-66ln9" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.781973 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-66ln9"] Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.829078 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d7e9-account-create-update-n7gsb"] Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.894283 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9pfh\" (UniqueName: \"kubernetes.io/projected/811ef770-3be6-4f3b-9fc3-dee4df710c4f-kube-api-access-c9pfh\") pod \"keystone-db-create-66ln9\" (UID: \"811ef770-3be6-4f3b-9fc3-dee4df710c4f\") " pod="openstack/keystone-db-create-66ln9" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.895068 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/811ef770-3be6-4f3b-9fc3-dee4df710c4f-operator-scripts\") pod \"keystone-db-create-66ln9\" (UID: \"811ef770-3be6-4f3b-9fc3-dee4df710c4f\") " pod="openstack/keystone-db-create-66ln9" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.924023 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-594d-account-create-update-dxsw5"] Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.934693 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-594d-account-create-update-dxsw5" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.945360 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.970273 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-594d-account-create-update-dxsw5"] Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.992953 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-kn2lh"] Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.994576 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kn2lh" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.997911 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9pfh\" (UniqueName: \"kubernetes.io/projected/811ef770-3be6-4f3b-9fc3-dee4df710c4f-kube-api-access-c9pfh\") pod \"keystone-db-create-66ln9\" (UID: \"811ef770-3be6-4f3b-9fc3-dee4df710c4f\") " pod="openstack/keystone-db-create-66ln9" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.998042 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/811ef770-3be6-4f3b-9fc3-dee4df710c4f-operator-scripts\") pod \"keystone-db-create-66ln9\" (UID: \"811ef770-3be6-4f3b-9fc3-dee4df710c4f\") " pod="openstack/keystone-db-create-66ln9" Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:58.999051 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/811ef770-3be6-4f3b-9fc3-dee4df710c4f-operator-scripts\") pod \"keystone-db-create-66ln9\" (UID: \"811ef770-3be6-4f3b-9fc3-dee4df710c4f\") " pod="openstack/keystone-db-create-66ln9" Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.028974 4764 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-kn2lh"] Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.054402 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-wkxnp"] Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.067559 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9pfh\" (UniqueName: \"kubernetes.io/projected/811ef770-3be6-4f3b-9fc3-dee4df710c4f-kube-api-access-c9pfh\") pod \"keystone-db-create-66ln9\" (UID: \"811ef770-3be6-4f3b-9fc3-dee4df710c4f\") " pod="openstack/keystone-db-create-66ln9" Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.081363 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-0f8b-account-create-update-mxbcn"] Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.082808 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0f8b-account-create-update-mxbcn" Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.087461 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.092327 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0f8b-account-create-update-mxbcn"] Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.100695 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-665kq\" (UniqueName: \"kubernetes.io/projected/7d681487-9af9-48e3-bb79-569b8c7bf26d-kube-api-access-665kq\") pod \"placement-db-create-kn2lh\" (UID: \"7d681487-9af9-48e3-bb79-569b8c7bf26d\") " pod="openstack/placement-db-create-kn2lh" Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.100767 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/01e4dc90-6790-447b-ac2a-d2dfcde88d17-operator-scripts\") pod \"keystone-594d-account-create-update-dxsw5\" (UID: \"01e4dc90-6790-447b-ac2a-d2dfcde88d17\") " pod="openstack/keystone-594d-account-create-update-dxsw5" Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.100928 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d681487-9af9-48e3-bb79-569b8c7bf26d-operator-scripts\") pod \"placement-db-create-kn2lh\" (UID: \"7d681487-9af9-48e3-bb79-569b8c7bf26d\") " pod="openstack/placement-db-create-kn2lh" Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.101076 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpt8k\" (UniqueName: \"kubernetes.io/projected/01e4dc90-6790-447b-ac2a-d2dfcde88d17-kube-api-access-lpt8k\") pod \"keystone-594d-account-create-update-dxsw5\" (UID: \"01e4dc90-6790-447b-ac2a-d2dfcde88d17\") " pod="openstack/keystone-594d-account-create-update-dxsw5" Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.174020 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-66ln9" Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.204215 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d681487-9af9-48e3-bb79-569b8c7bf26d-operator-scripts\") pod \"placement-db-create-kn2lh\" (UID: \"7d681487-9af9-48e3-bb79-569b8c7bf26d\") " pod="openstack/placement-db-create-kn2lh" Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.204330 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rst47\" (UniqueName: \"kubernetes.io/projected/9d27c011-b8dd-4f14-9833-413f7a8faf8a-kube-api-access-rst47\") pod \"placement-0f8b-account-create-update-mxbcn\" (UID: \"9d27c011-b8dd-4f14-9833-413f7a8faf8a\") " pod="openstack/placement-0f8b-account-create-update-mxbcn" Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.204395 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpt8k\" (UniqueName: \"kubernetes.io/projected/01e4dc90-6790-447b-ac2a-d2dfcde88d17-kube-api-access-lpt8k\") pod \"keystone-594d-account-create-update-dxsw5\" (UID: \"01e4dc90-6790-447b-ac2a-d2dfcde88d17\") " pod="openstack/keystone-594d-account-create-update-dxsw5" Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.204453 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d27c011-b8dd-4f14-9833-413f7a8faf8a-operator-scripts\") pod \"placement-0f8b-account-create-update-mxbcn\" (UID: \"9d27c011-b8dd-4f14-9833-413f7a8faf8a\") " pod="openstack/placement-0f8b-account-create-update-mxbcn" Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.204504 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-665kq\" (UniqueName: 
\"kubernetes.io/projected/7d681487-9af9-48e3-bb79-569b8c7bf26d-kube-api-access-665kq\") pod \"placement-db-create-kn2lh\" (UID: \"7d681487-9af9-48e3-bb79-569b8c7bf26d\") " pod="openstack/placement-db-create-kn2lh" Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.204661 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01e4dc90-6790-447b-ac2a-d2dfcde88d17-operator-scripts\") pod \"keystone-594d-account-create-update-dxsw5\" (UID: \"01e4dc90-6790-447b-ac2a-d2dfcde88d17\") " pod="openstack/keystone-594d-account-create-update-dxsw5" Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.205225 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d681487-9af9-48e3-bb79-569b8c7bf26d-operator-scripts\") pod \"placement-db-create-kn2lh\" (UID: \"7d681487-9af9-48e3-bb79-569b8c7bf26d\") " pod="openstack/placement-db-create-kn2lh" Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.205792 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01e4dc90-6790-447b-ac2a-d2dfcde88d17-operator-scripts\") pod \"keystone-594d-account-create-update-dxsw5\" (UID: \"01e4dc90-6790-447b-ac2a-d2dfcde88d17\") " pod="openstack/keystone-594d-account-create-update-dxsw5" Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.228709 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpt8k\" (UniqueName: \"kubernetes.io/projected/01e4dc90-6790-447b-ac2a-d2dfcde88d17-kube-api-access-lpt8k\") pod \"keystone-594d-account-create-update-dxsw5\" (UID: \"01e4dc90-6790-447b-ac2a-d2dfcde88d17\") " pod="openstack/keystone-594d-account-create-update-dxsw5" Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.232237 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-665kq\" (UniqueName: \"kubernetes.io/projected/7d681487-9af9-48e3-bb79-569b8c7bf26d-kube-api-access-665kq\") pod \"placement-db-create-kn2lh\" (UID: \"7d681487-9af9-48e3-bb79-569b8c7bf26d\") " pod="openstack/placement-db-create-kn2lh" Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.306987 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rst47\" (UniqueName: \"kubernetes.io/projected/9d27c011-b8dd-4f14-9833-413f7a8faf8a-kube-api-access-rst47\") pod \"placement-0f8b-account-create-update-mxbcn\" (UID: \"9d27c011-b8dd-4f14-9833-413f7a8faf8a\") " pod="openstack/placement-0f8b-account-create-update-mxbcn" Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.307443 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d27c011-b8dd-4f14-9833-413f7a8faf8a-operator-scripts\") pod \"placement-0f8b-account-create-update-mxbcn\" (UID: \"9d27c011-b8dd-4f14-9833-413f7a8faf8a\") " pod="openstack/placement-0f8b-account-create-update-mxbcn" Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.308250 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d27c011-b8dd-4f14-9833-413f7a8faf8a-operator-scripts\") pod \"placement-0f8b-account-create-update-mxbcn\" (UID: \"9d27c011-b8dd-4f14-9833-413f7a8faf8a\") " pod="openstack/placement-0f8b-account-create-update-mxbcn" Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.330669 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-594d-account-create-update-dxsw5" Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.350443 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rst47\" (UniqueName: \"kubernetes.io/projected/9d27c011-b8dd-4f14-9833-413f7a8faf8a-kube-api-access-rst47\") pod \"placement-0f8b-account-create-update-mxbcn\" (UID: \"9d27c011-b8dd-4f14-9833-413f7a8faf8a\") " pod="openstack/placement-0f8b-account-create-update-mxbcn" Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.415805 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d7e9-account-create-update-n7gsb" event={"ID":"693ba99b-99d0-4b09-9f49-9deefe05abac","Type":"ContainerStarted","Data":"da76ff68a5f6a4d7d50712be1138bbea55547749d69b4508eda52be21fd47ce6"} Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.415852 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d7e9-account-create-update-n7gsb" event={"ID":"693ba99b-99d0-4b09-9f49-9deefe05abac","Type":"ContainerStarted","Data":"f80fe4762d2af82804e15db7d3b8d4627b64c32ef1882fdf483222746daef33d"} Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.438172 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-wkxnp" event={"ID":"75f29150-3689-48a6-9248-b6774f85fcd2","Type":"ContainerStarted","Data":"98058d7a077d55dcc4dc3081744bc24f8ca7e82793695f8748e2850e19fbd5a3"} Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.438225 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-wkxnp" event={"ID":"75f29150-3689-48a6-9248-b6774f85fcd2","Type":"ContainerStarted","Data":"f6bcb6e597f958bade23e793909cd2f5b0c44f96f937977251a4fc97d7a51cda"} Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.444510 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-d7e9-account-create-update-n7gsb" podStartSLOduration=2.444484843 
podStartE2EDuration="2.444484843s" podCreationTimestamp="2026-03-09 13:39:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:39:59.437881491 +0000 UTC m=+1154.688053419" watchObservedRunningTime="2026-03-09 13:39:59.444484843 +0000 UTC m=+1154.694656761" Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.462537 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kn2lh" Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.463734 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0f8b-account-create-update-mxbcn" Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.471991 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-wkxnp" podStartSLOduration=2.471955059 podStartE2EDuration="2.471955059s" podCreationTimestamp="2026-03-09 13:39:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:39:59.46315196 +0000 UTC m=+1154.713323868" watchObservedRunningTime="2026-03-09 13:39:59.471955059 +0000 UTC m=+1154.722126967" Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.603462 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bb93a9a-6443-4352-b7ae-64f953af06c3" path="/var/lib/kubelet/pods/2bb93a9a-6443-4352-b7ae-64f953af06c3/volumes" Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.748365 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-66ln9"] Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.909529 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-594d-account-create-update-dxsw5"] Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.122152 4764 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/placement-db-create-kn2lh"] Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.148692 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551060-n7f54"] Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.154321 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551060-n7f54" Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.157385 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551060-n7f54"] Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.160429 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.160477 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.161039 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.225320 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb5hw\" (UniqueName: \"kubernetes.io/projected/16623a65-1bef-4faa-a891-bae0a7d04977-kube-api-access-gb5hw\") pod \"auto-csr-approver-29551060-n7f54\" (UID: \"16623a65-1bef-4faa-a891-bae0a7d04977\") " pod="openshift-infra/auto-csr-approver-29551060-n7f54" Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.241616 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0f8b-account-create-update-mxbcn"] Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.327497 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb5hw\" (UniqueName: 
\"kubernetes.io/projected/16623a65-1bef-4faa-a891-bae0a7d04977-kube-api-access-gb5hw\") pod \"auto-csr-approver-29551060-n7f54\" (UID: \"16623a65-1bef-4faa-a891-bae0a7d04977\") " pod="openshift-infra/auto-csr-approver-29551060-n7f54" Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.352961 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb5hw\" (UniqueName: \"kubernetes.io/projected/16623a65-1bef-4faa-a891-bae0a7d04977-kube-api-access-gb5hw\") pod \"auto-csr-approver-29551060-n7f54\" (UID: \"16623a65-1bef-4faa-a891-bae0a7d04977\") " pod="openshift-infra/auto-csr-approver-29551060-n7f54" Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.451678 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-594d-account-create-update-dxsw5" event={"ID":"01e4dc90-6790-447b-ac2a-d2dfcde88d17","Type":"ContainerStarted","Data":"6de1cfcabddc41034a2e1c58fb3ef2484991e0985876cbbdc822fe1df3641db1"} Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.451752 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-594d-account-create-update-dxsw5" event={"ID":"01e4dc90-6790-447b-ac2a-d2dfcde88d17","Type":"ContainerStarted","Data":"6db0dcbeaa0e445d9563d80bdb4ee91587520750c60db46e9ceacf66f6d60c54"} Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.455406 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0f8b-account-create-update-mxbcn" event={"ID":"9d27c011-b8dd-4f14-9833-413f7a8faf8a","Type":"ContainerStarted","Data":"01ef1c726b7e7270c862ba5b9e31c73fecc7bf0ea188cc226f1bf52e6cb5af33"} Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.455440 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0f8b-account-create-update-mxbcn" event={"ID":"9d27c011-b8dd-4f14-9833-413f7a8faf8a","Type":"ContainerStarted","Data":"1f18354cc3ede52419192c5d865dc6321532287310adb5cffdafe9ba08f30940"} Mar 09 13:40:00 crc kubenswrapper[4764]: 
I0309 13:40:00.460691 4764 generic.go:334] "Generic (PLEG): container finished" podID="693ba99b-99d0-4b09-9f49-9deefe05abac" containerID="da76ff68a5f6a4d7d50712be1138bbea55547749d69b4508eda52be21fd47ce6" exitCode=0 Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.460775 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d7e9-account-create-update-n7gsb" event={"ID":"693ba99b-99d0-4b09-9f49-9deefe05abac","Type":"ContainerDied","Data":"da76ff68a5f6a4d7d50712be1138bbea55547749d69b4508eda52be21fd47ce6"} Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.463236 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kn2lh" event={"ID":"7d681487-9af9-48e3-bb79-569b8c7bf26d","Type":"ContainerStarted","Data":"6bccf5846329079b445a150aae1cbe1d8637bb12a3644ba9afce7863cefc0fe8"} Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.463291 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kn2lh" event={"ID":"7d681487-9af9-48e3-bb79-569b8c7bf26d","Type":"ContainerStarted","Data":"cb35dabb9f5f0edd526374363673fcbd6955aa3235f600a648fe50273b0796dd"} Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.465087 4764 generic.go:334] "Generic (PLEG): container finished" podID="75f29150-3689-48a6-9248-b6774f85fcd2" containerID="98058d7a077d55dcc4dc3081744bc24f8ca7e82793695f8748e2850e19fbd5a3" exitCode=0 Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.465197 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-wkxnp" event={"ID":"75f29150-3689-48a6-9248-b6774f85fcd2","Type":"ContainerDied","Data":"98058d7a077d55dcc4dc3081744bc24f8ca7e82793695f8748e2850e19fbd5a3"} Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.490781 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551060-n7f54" Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.499965 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-66ln9" event={"ID":"811ef770-3be6-4f3b-9fc3-dee4df710c4f","Type":"ContainerStarted","Data":"cb8ff00b99c398bb890b473678e6ce951f3d71cc55317946f469bc84cba9ae54"} Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.500052 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-66ln9" event={"ID":"811ef770-3be6-4f3b-9fc3-dee4df710c4f","Type":"ContainerStarted","Data":"616b45c0cfdfa8c2edf2207afc781be5cb32d60b419fa2c0f9c02788b1b97cc1"} Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.511243 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-594d-account-create-update-dxsw5" podStartSLOduration=2.511205151 podStartE2EDuration="2.511205151s" podCreationTimestamp="2026-03-09 13:39:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:40:00.482795351 +0000 UTC m=+1155.732967259" watchObservedRunningTime="2026-03-09 13:40:00.511205151 +0000 UTC m=+1155.761377049" Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.546892 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-0f8b-account-create-update-mxbcn" podStartSLOduration=1.546859171 podStartE2EDuration="1.546859171s" podCreationTimestamp="2026-03-09 13:39:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:40:00.537363824 +0000 UTC m=+1155.787535732" watchObservedRunningTime="2026-03-09 13:40:00.546859171 +0000 UTC m=+1155.797031079" Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.634029 4764 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/placement-db-create-kn2lh" podStartSLOduration=2.6339984039999997 podStartE2EDuration="2.633998404s" podCreationTimestamp="2026-03-09 13:39:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:40:00.602264176 +0000 UTC m=+1155.852436084" watchObservedRunningTime="2026-03-09 13:40:00.633998404 +0000 UTC m=+1155.884170312" Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.648858 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-66ln9" podStartSLOduration=2.648834021 podStartE2EDuration="2.648834021s" podCreationTimestamp="2026-03-09 13:39:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:40:00.640565375 +0000 UTC m=+1155.890737283" watchObservedRunningTime="2026-03-09 13:40:00.648834021 +0000 UTC m=+1155.899005929" Mar 09 13:40:01 crc kubenswrapper[4764]: I0309 13:40:01.067787 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551060-n7f54"] Mar 09 13:40:01 crc kubenswrapper[4764]: I0309 13:40:01.087110 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 13:40:01 crc kubenswrapper[4764]: I0309 13:40:01.512717 4764 generic.go:334] "Generic (PLEG): container finished" podID="01e4dc90-6790-447b-ac2a-d2dfcde88d17" containerID="6de1cfcabddc41034a2e1c58fb3ef2484991e0985876cbbdc822fe1df3641db1" exitCode=0 Mar 09 13:40:01 crc kubenswrapper[4764]: I0309 13:40:01.512851 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-594d-account-create-update-dxsw5" event={"ID":"01e4dc90-6790-447b-ac2a-d2dfcde88d17","Type":"ContainerDied","Data":"6de1cfcabddc41034a2e1c58fb3ef2484991e0985876cbbdc822fe1df3641db1"} Mar 09 13:40:01 crc kubenswrapper[4764]: I0309 13:40:01.517289 
4764 generic.go:334] "Generic (PLEG): container finished" podID="9d27c011-b8dd-4f14-9833-413f7a8faf8a" containerID="01ef1c726b7e7270c862ba5b9e31c73fecc7bf0ea188cc226f1bf52e6cb5af33" exitCode=0 Mar 09 13:40:01 crc kubenswrapper[4764]: I0309 13:40:01.517339 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0f8b-account-create-update-mxbcn" event={"ID":"9d27c011-b8dd-4f14-9833-413f7a8faf8a","Type":"ContainerDied","Data":"01ef1c726b7e7270c862ba5b9e31c73fecc7bf0ea188cc226f1bf52e6cb5af33"} Mar 09 13:40:01 crc kubenswrapper[4764]: I0309 13:40:01.519457 4764 generic.go:334] "Generic (PLEG): container finished" podID="7d681487-9af9-48e3-bb79-569b8c7bf26d" containerID="6bccf5846329079b445a150aae1cbe1d8637bb12a3644ba9afce7863cefc0fe8" exitCode=0 Mar 09 13:40:01 crc kubenswrapper[4764]: I0309 13:40:01.519607 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kn2lh" event={"ID":"7d681487-9af9-48e3-bb79-569b8c7bf26d","Type":"ContainerDied","Data":"6bccf5846329079b445a150aae1cbe1d8637bb12a3644ba9afce7863cefc0fe8"} Mar 09 13:40:01 crc kubenswrapper[4764]: I0309 13:40:01.522321 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551060-n7f54" event={"ID":"16623a65-1bef-4faa-a891-bae0a7d04977","Type":"ContainerStarted","Data":"7002d9bc70f8ed4dbd1fb5ae2c17202e34dbcfe2e8bc11fc9b82eda847bd6796"} Mar 09 13:40:01 crc kubenswrapper[4764]: I0309 13:40:01.530768 4764 generic.go:334] "Generic (PLEG): container finished" podID="811ef770-3be6-4f3b-9fc3-dee4df710c4f" containerID="cb8ff00b99c398bb890b473678e6ce951f3d71cc55317946f469bc84cba9ae54" exitCode=0 Mar 09 13:40:01 crc kubenswrapper[4764]: I0309 13:40:01.531228 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-66ln9" event={"ID":"811ef770-3be6-4f3b-9fc3-dee4df710c4f","Type":"ContainerDied","Data":"cb8ff00b99c398bb890b473678e6ce951f3d71cc55317946f469bc84cba9ae54"} Mar 09 13:40:02 crc 
kubenswrapper[4764]: I0309 13:40:02.032203 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d7e9-account-create-update-n7gsb" Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.039414 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-wkxnp" Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.170521 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hszn\" (UniqueName: \"kubernetes.io/projected/693ba99b-99d0-4b09-9f49-9deefe05abac-kube-api-access-2hszn\") pod \"693ba99b-99d0-4b09-9f49-9deefe05abac\" (UID: \"693ba99b-99d0-4b09-9f49-9deefe05abac\") " Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.170628 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75f29150-3689-48a6-9248-b6774f85fcd2-operator-scripts\") pod \"75f29150-3689-48a6-9248-b6774f85fcd2\" (UID: \"75f29150-3689-48a6-9248-b6774f85fcd2\") " Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.170705 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdsck\" (UniqueName: \"kubernetes.io/projected/75f29150-3689-48a6-9248-b6774f85fcd2-kube-api-access-vdsck\") pod \"75f29150-3689-48a6-9248-b6774f85fcd2\" (UID: \"75f29150-3689-48a6-9248-b6774f85fcd2\") " Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.170912 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/693ba99b-99d0-4b09-9f49-9deefe05abac-operator-scripts\") pod \"693ba99b-99d0-4b09-9f49-9deefe05abac\" (UID: \"693ba99b-99d0-4b09-9f49-9deefe05abac\") " Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.171610 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/75f29150-3689-48a6-9248-b6774f85fcd2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "75f29150-3689-48a6-9248-b6774f85fcd2" (UID: "75f29150-3689-48a6-9248-b6774f85fcd2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.171996 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/693ba99b-99d0-4b09-9f49-9deefe05abac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "693ba99b-99d0-4b09-9f49-9deefe05abac" (UID: "693ba99b-99d0-4b09-9f49-9deefe05abac"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.184007 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75f29150-3689-48a6-9248-b6774f85fcd2-kube-api-access-vdsck" (OuterVolumeSpecName: "kube-api-access-vdsck") pod "75f29150-3689-48a6-9248-b6774f85fcd2" (UID: "75f29150-3689-48a6-9248-b6774f85fcd2"). InnerVolumeSpecName "kube-api-access-vdsck". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.184104 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/693ba99b-99d0-4b09-9f49-9deefe05abac-kube-api-access-2hszn" (OuterVolumeSpecName: "kube-api-access-2hszn") pod "693ba99b-99d0-4b09-9f49-9deefe05abac" (UID: "693ba99b-99d0-4b09-9f49-9deefe05abac"). InnerVolumeSpecName "kube-api-access-2hszn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.273513 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/693ba99b-99d0-4b09-9f49-9deefe05abac-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.273551 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hszn\" (UniqueName: \"kubernetes.io/projected/693ba99b-99d0-4b09-9f49-9deefe05abac-kube-api-access-2hszn\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.273567 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75f29150-3689-48a6-9248-b6774f85fcd2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.273577 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdsck\" (UniqueName: \"kubernetes.io/projected/75f29150-3689-48a6-9248-b6774f85fcd2-kube-api-access-vdsck\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.548825 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-wkxnp" event={"ID":"75f29150-3689-48a6-9248-b6774f85fcd2","Type":"ContainerDied","Data":"f6bcb6e597f958bade23e793909cd2f5b0c44f96f937977251a4fc97d7a51cda"} Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.548883 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6bcb6e597f958bade23e793909cd2f5b0c44f96f937977251a4fc97d7a51cda" Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.548840 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-wkxnp" Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.552336 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-d7e9-account-create-update-n7gsb" Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.555920 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d7e9-account-create-update-n7gsb" event={"ID":"693ba99b-99d0-4b09-9f49-9deefe05abac","Type":"ContainerDied","Data":"f80fe4762d2af82804e15db7d3b8d4627b64c32ef1882fdf483222746daef33d"} Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.555970 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f80fe4762d2af82804e15db7d3b8d4627b64c32ef1882fdf483222746daef33d" Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.923583 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kn2lh" Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.985498 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-665kq\" (UniqueName: \"kubernetes.io/projected/7d681487-9af9-48e3-bb79-569b8c7bf26d-kube-api-access-665kq\") pod \"7d681487-9af9-48e3-bb79-569b8c7bf26d\" (UID: \"7d681487-9af9-48e3-bb79-569b8c7bf26d\") " Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.985583 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d681487-9af9-48e3-bb79-569b8c7bf26d-operator-scripts\") pod \"7d681487-9af9-48e3-bb79-569b8c7bf26d\" (UID: \"7d681487-9af9-48e3-bb79-569b8c7bf26d\") " Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.986275 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d681487-9af9-48e3-bb79-569b8c7bf26d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7d681487-9af9-48e3-bb79-569b8c7bf26d" (UID: "7d681487-9af9-48e3-bb79-569b8c7bf26d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.992316 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d681487-9af9-48e3-bb79-569b8c7bf26d-kube-api-access-665kq" (OuterVolumeSpecName: "kube-api-access-665kq") pod "7d681487-9af9-48e3-bb79-569b8c7bf26d" (UID: "7d681487-9af9-48e3-bb79-569b8c7bf26d"). InnerVolumeSpecName "kube-api-access-665kq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.089163 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-665kq\" (UniqueName: \"kubernetes.io/projected/7d681487-9af9-48e3-bb79-569b8c7bf26d-kube-api-access-665kq\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.089222 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d681487-9af9-48e3-bb79-569b8c7bf26d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.214232 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-594d-account-create-update-dxsw5" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.222625 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0f8b-account-create-update-mxbcn" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.229993 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-66ln9" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.291731 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rst47\" (UniqueName: \"kubernetes.io/projected/9d27c011-b8dd-4f14-9833-413f7a8faf8a-kube-api-access-rst47\") pod \"9d27c011-b8dd-4f14-9833-413f7a8faf8a\" (UID: \"9d27c011-b8dd-4f14-9833-413f7a8faf8a\") " Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.292290 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d27c011-b8dd-4f14-9833-413f7a8faf8a-operator-scripts\") pod \"9d27c011-b8dd-4f14-9833-413f7a8faf8a\" (UID: \"9d27c011-b8dd-4f14-9833-413f7a8faf8a\") " Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.292385 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/811ef770-3be6-4f3b-9fc3-dee4df710c4f-operator-scripts\") pod \"811ef770-3be6-4f3b-9fc3-dee4df710c4f\" (UID: \"811ef770-3be6-4f3b-9fc3-dee4df710c4f\") " Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.292475 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9pfh\" (UniqueName: \"kubernetes.io/projected/811ef770-3be6-4f3b-9fc3-dee4df710c4f-kube-api-access-c9pfh\") pod \"811ef770-3be6-4f3b-9fc3-dee4df710c4f\" (UID: \"811ef770-3be6-4f3b-9fc3-dee4df710c4f\") " Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.292520 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpt8k\" (UniqueName: \"kubernetes.io/projected/01e4dc90-6790-447b-ac2a-d2dfcde88d17-kube-api-access-lpt8k\") pod \"01e4dc90-6790-447b-ac2a-d2dfcde88d17\" (UID: \"01e4dc90-6790-447b-ac2a-d2dfcde88d17\") " Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.292672 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01e4dc90-6790-447b-ac2a-d2dfcde88d17-operator-scripts\") pod \"01e4dc90-6790-447b-ac2a-d2dfcde88d17\" (UID: \"01e4dc90-6790-447b-ac2a-d2dfcde88d17\") " Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.294662 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01e4dc90-6790-447b-ac2a-d2dfcde88d17-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "01e4dc90-6790-447b-ac2a-d2dfcde88d17" (UID: "01e4dc90-6790-447b-ac2a-d2dfcde88d17"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.296324 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/811ef770-3be6-4f3b-9fc3-dee4df710c4f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "811ef770-3be6-4f3b-9fc3-dee4df710c4f" (UID: "811ef770-3be6-4f3b-9fc3-dee4df710c4f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.296529 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d27c011-b8dd-4f14-9833-413f7a8faf8a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9d27c011-b8dd-4f14-9833-413f7a8faf8a" (UID: "9d27c011-b8dd-4f14-9833-413f7a8faf8a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.299387 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d27c011-b8dd-4f14-9833-413f7a8faf8a-kube-api-access-rst47" (OuterVolumeSpecName: "kube-api-access-rst47") pod "9d27c011-b8dd-4f14-9833-413f7a8faf8a" (UID: "9d27c011-b8dd-4f14-9833-413f7a8faf8a"). 
InnerVolumeSpecName "kube-api-access-rst47". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.299484 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/811ef770-3be6-4f3b-9fc3-dee4df710c4f-kube-api-access-c9pfh" (OuterVolumeSpecName: "kube-api-access-c9pfh") pod "811ef770-3be6-4f3b-9fc3-dee4df710c4f" (UID: "811ef770-3be6-4f3b-9fc3-dee4df710c4f"). InnerVolumeSpecName "kube-api-access-c9pfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.300415 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01e4dc90-6790-447b-ac2a-d2dfcde88d17-kube-api-access-lpt8k" (OuterVolumeSpecName: "kube-api-access-lpt8k") pod "01e4dc90-6790-447b-ac2a-d2dfcde88d17" (UID: "01e4dc90-6790-447b-ac2a-d2dfcde88d17"). InnerVolumeSpecName "kube-api-access-lpt8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.395397 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rst47\" (UniqueName: \"kubernetes.io/projected/9d27c011-b8dd-4f14-9833-413f7a8faf8a-kube-api-access-rst47\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.395446 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d27c011-b8dd-4f14-9833-413f7a8faf8a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.395464 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/811ef770-3be6-4f3b-9fc3-dee4df710c4f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.395477 4764 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-c9pfh\" (UniqueName: \"kubernetes.io/projected/811ef770-3be6-4f3b-9fc3-dee4df710c4f-kube-api-access-c9pfh\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.395490 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpt8k\" (UniqueName: \"kubernetes.io/projected/01e4dc90-6790-447b-ac2a-d2dfcde88d17-kube-api-access-lpt8k\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.395507 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01e4dc90-6790-447b-ac2a-d2dfcde88d17-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.572191 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-594d-account-create-update-dxsw5" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.579065 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-0f8b-account-create-update-mxbcn" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.583433 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-594d-account-create-update-dxsw5" event={"ID":"01e4dc90-6790-447b-ac2a-d2dfcde88d17","Type":"ContainerDied","Data":"6db0dcbeaa0e445d9563d80bdb4ee91587520750c60db46e9ceacf66f6d60c54"} Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.583487 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6db0dcbeaa0e445d9563d80bdb4ee91587520750c60db46e9ceacf66f6d60c54" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.583500 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0f8b-account-create-update-mxbcn" event={"ID":"9d27c011-b8dd-4f14-9833-413f7a8faf8a","Type":"ContainerDied","Data":"1f18354cc3ede52419192c5d865dc6321532287310adb5cffdafe9ba08f30940"} Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.583518 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f18354cc3ede52419192c5d865dc6321532287310adb5cffdafe9ba08f30940" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.586222 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kn2lh" event={"ID":"7d681487-9af9-48e3-bb79-569b8c7bf26d","Type":"ContainerDied","Data":"cb35dabb9f5f0edd526374363673fcbd6955aa3235f600a648fe50273b0796dd"} Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.586266 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb35dabb9f5f0edd526374363673fcbd6955aa3235f600a648fe50273b0796dd" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.586461 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-kn2lh" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.593372 4764 generic.go:334] "Generic (PLEG): container finished" podID="16623a65-1bef-4faa-a891-bae0a7d04977" containerID="e53ea6806faef5cb682bac2b7668fda78ebb3a7b30791520082d7ab2a13f5aa0" exitCode=0 Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.593496 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551060-n7f54" event={"ID":"16623a65-1bef-4faa-a891-bae0a7d04977","Type":"ContainerDied","Data":"e53ea6806faef5cb682bac2b7668fda78ebb3a7b30791520082d7ab2a13f5aa0"} Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.595215 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-66ln9" event={"ID":"811ef770-3be6-4f3b-9fc3-dee4df710c4f","Type":"ContainerDied","Data":"616b45c0cfdfa8c2edf2207afc781be5cb32d60b419fa2c0f9c02788b1b97cc1"} Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.595256 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="616b45c0cfdfa8c2edf2207afc781be5cb32d60b419fa2c0f9c02788b1b97cc1" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.595322 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-66ln9" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.524041 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-rvfs9"] Mar 09 13:40:04 crc kubenswrapper[4764]: E0309 13:40:04.524700 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d27c011-b8dd-4f14-9833-413f7a8faf8a" containerName="mariadb-account-create-update" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.524716 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d27c011-b8dd-4f14-9833-413f7a8faf8a" containerName="mariadb-account-create-update" Mar 09 13:40:04 crc kubenswrapper[4764]: E0309 13:40:04.524739 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75f29150-3689-48a6-9248-b6774f85fcd2" containerName="mariadb-database-create" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.524745 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="75f29150-3689-48a6-9248-b6774f85fcd2" containerName="mariadb-database-create" Mar 09 13:40:04 crc kubenswrapper[4764]: E0309 13:40:04.524760 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d681487-9af9-48e3-bb79-569b8c7bf26d" containerName="mariadb-database-create" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.524766 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d681487-9af9-48e3-bb79-569b8c7bf26d" containerName="mariadb-database-create" Mar 09 13:40:04 crc kubenswrapper[4764]: E0309 13:40:04.524779 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e4dc90-6790-447b-ac2a-d2dfcde88d17" containerName="mariadb-account-create-update" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.524786 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e4dc90-6790-447b-ac2a-d2dfcde88d17" containerName="mariadb-account-create-update" Mar 09 13:40:04 crc kubenswrapper[4764]: E0309 13:40:04.524797 4764 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="811ef770-3be6-4f3b-9fc3-dee4df710c4f" containerName="mariadb-database-create" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.524803 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="811ef770-3be6-4f3b-9fc3-dee4df710c4f" containerName="mariadb-database-create" Mar 09 13:40:04 crc kubenswrapper[4764]: E0309 13:40:04.524814 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="693ba99b-99d0-4b09-9f49-9deefe05abac" containerName="mariadb-account-create-update" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.524820 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="693ba99b-99d0-4b09-9f49-9deefe05abac" containerName="mariadb-account-create-update" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.524979 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d27c011-b8dd-4f14-9833-413f7a8faf8a" containerName="mariadb-account-create-update" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.524990 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="811ef770-3be6-4f3b-9fc3-dee4df710c4f" containerName="mariadb-database-create" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.524999 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="01e4dc90-6790-447b-ac2a-d2dfcde88d17" containerName="mariadb-account-create-update" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.525007 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d681487-9af9-48e3-bb79-569b8c7bf26d" containerName="mariadb-database-create" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.525018 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="75f29150-3689-48a6-9248-b6774f85fcd2" containerName="mariadb-database-create" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.525029 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="693ba99b-99d0-4b09-9f49-9deefe05abac" 
containerName="mariadb-account-create-update" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.525596 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-rvfs9" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.534610 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-rvfs9"] Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.534972 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.618918 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b28dc3f9-47a1-436c-865c-4d98e6ba960c-operator-scripts\") pod \"root-account-create-update-rvfs9\" (UID: \"b28dc3f9-47a1-436c-865c-4d98e6ba960c\") " pod="openstack/root-account-create-update-rvfs9" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.619442 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ftcz\" (UniqueName: \"kubernetes.io/projected/b28dc3f9-47a1-436c-865c-4d98e6ba960c-kube-api-access-6ftcz\") pod \"root-account-create-update-rvfs9\" (UID: \"b28dc3f9-47a1-436c-865c-4d98e6ba960c\") " pod="openstack/root-account-create-update-rvfs9" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.723072 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ftcz\" (UniqueName: \"kubernetes.io/projected/b28dc3f9-47a1-436c-865c-4d98e6ba960c-kube-api-access-6ftcz\") pod \"root-account-create-update-rvfs9\" (UID: \"b28dc3f9-47a1-436c-865c-4d98e6ba960c\") " pod="openstack/root-account-create-update-rvfs9" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.723717 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b28dc3f9-47a1-436c-865c-4d98e6ba960c-operator-scripts\") pod \"root-account-create-update-rvfs9\" (UID: \"b28dc3f9-47a1-436c-865c-4d98e6ba960c\") " pod="openstack/root-account-create-update-rvfs9" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.724899 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b28dc3f9-47a1-436c-865c-4d98e6ba960c-operator-scripts\") pod \"root-account-create-update-rvfs9\" (UID: \"b28dc3f9-47a1-436c-865c-4d98e6ba960c\") " pod="openstack/root-account-create-update-rvfs9" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.746412 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ftcz\" (UniqueName: \"kubernetes.io/projected/b28dc3f9-47a1-436c-865c-4d98e6ba960c-kube-api-access-6ftcz\") pod \"root-account-create-update-rvfs9\" (UID: \"b28dc3f9-47a1-436c-865c-4d98e6ba960c\") " pod="openstack/root-account-create-update-rvfs9" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.854133 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-rvfs9" Mar 09 13:40:05 crc kubenswrapper[4764]: I0309 13:40:05.000417 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551060-n7f54" Mar 09 13:40:05 crc kubenswrapper[4764]: I0309 13:40:05.134203 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gb5hw\" (UniqueName: \"kubernetes.io/projected/16623a65-1bef-4faa-a891-bae0a7d04977-kube-api-access-gb5hw\") pod \"16623a65-1bef-4faa-a891-bae0a7d04977\" (UID: \"16623a65-1bef-4faa-a891-bae0a7d04977\") " Mar 09 13:40:05 crc kubenswrapper[4764]: I0309 13:40:05.152943 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16623a65-1bef-4faa-a891-bae0a7d04977-kube-api-access-gb5hw" (OuterVolumeSpecName: "kube-api-access-gb5hw") pod "16623a65-1bef-4faa-a891-bae0a7d04977" (UID: "16623a65-1bef-4faa-a891-bae0a7d04977"). InnerVolumeSpecName "kube-api-access-gb5hw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:05 crc kubenswrapper[4764]: I0309 13:40:05.237116 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gb5hw\" (UniqueName: \"kubernetes.io/projected/16623a65-1bef-4faa-a891-bae0a7d04977-kube-api-access-gb5hw\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:05 crc kubenswrapper[4764]: I0309 13:40:05.336717 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-rvfs9"] Mar 09 13:40:05 crc kubenswrapper[4764]: W0309 13:40:05.336807 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb28dc3f9_47a1_436c_865c_4d98e6ba960c.slice/crio-c8e1b592a881fc9ae62941eafc71a37253c86e963d35cd07e1fd548914aa0be6 WatchSource:0}: Error finding container c8e1b592a881fc9ae62941eafc71a37253c86e963d35cd07e1fd548914aa0be6: Status 404 returned error can't find the container with id c8e1b592a881fc9ae62941eafc71a37253c86e963d35cd07e1fd548914aa0be6 Mar 09 13:40:05 crc kubenswrapper[4764]: I0309 13:40:05.612983 4764 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/root-account-create-update-rvfs9" event={"ID":"b28dc3f9-47a1-436c-865c-4d98e6ba960c","Type":"ContainerStarted","Data":"c8e1b592a881fc9ae62941eafc71a37253c86e963d35cd07e1fd548914aa0be6"} Mar 09 13:40:05 crc kubenswrapper[4764]: I0309 13:40:05.614701 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551060-n7f54" event={"ID":"16623a65-1bef-4faa-a891-bae0a7d04977","Type":"ContainerDied","Data":"7002d9bc70f8ed4dbd1fb5ae2c17202e34dbcfe2e8bc11fc9b82eda847bd6796"} Mar 09 13:40:05 crc kubenswrapper[4764]: I0309 13:40:05.614732 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7002d9bc70f8ed4dbd1fb5ae2c17202e34dbcfe2e8bc11fc9b82eda847bd6796" Mar 09 13:40:05 crc kubenswrapper[4764]: I0309 13:40:05.614815 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551060-n7f54" Mar 09 13:40:06 crc kubenswrapper[4764]: I0309 13:40:06.101094 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551054-n54vp"] Mar 09 13:40:06 crc kubenswrapper[4764]: I0309 13:40:06.110234 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551054-n54vp"] Mar 09 13:40:06 crc kubenswrapper[4764]: I0309 13:40:06.627233 4764 generic.go:334] "Generic (PLEG): container finished" podID="b28dc3f9-47a1-436c-865c-4d98e6ba960c" containerID="e74ba0435a2089e000c8df414a7a0d71d67c7b4a00cc240811cbefae67ccd184" exitCode=0 Mar 09 13:40:06 crc kubenswrapper[4764]: I0309 13:40:06.627284 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rvfs9" event={"ID":"b28dc3f9-47a1-436c-865c-4d98e6ba960c","Type":"ContainerDied","Data":"e74ba0435a2089e000c8df414a7a0d71d67c7b4a00cc240811cbefae67ccd184"} Mar 09 13:40:07 crc kubenswrapper[4764]: I0309 13:40:07.571778 4764 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="034371f5-4d6d-4a44-9678-9093ffaf3f9d" path="/var/lib/kubelet/pods/034371f5-4d6d-4a44-9678-9093ffaf3f9d/volumes" Mar 09 13:40:07 crc kubenswrapper[4764]: I0309 13:40:07.969739 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-rvfs9" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.101842 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ftcz\" (UniqueName: \"kubernetes.io/projected/b28dc3f9-47a1-436c-865c-4d98e6ba960c-kube-api-access-6ftcz\") pod \"b28dc3f9-47a1-436c-865c-4d98e6ba960c\" (UID: \"b28dc3f9-47a1-436c-865c-4d98e6ba960c\") " Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.101942 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b28dc3f9-47a1-436c-865c-4d98e6ba960c-operator-scripts\") pod \"b28dc3f9-47a1-436c-865c-4d98e6ba960c\" (UID: \"b28dc3f9-47a1-436c-865c-4d98e6ba960c\") " Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.107826 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b28dc3f9-47a1-436c-865c-4d98e6ba960c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b28dc3f9-47a1-436c-865c-4d98e6ba960c" (UID: "b28dc3f9-47a1-436c-865c-4d98e6ba960c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.110405 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b28dc3f9-47a1-436c-865c-4d98e6ba960c-kube-api-access-6ftcz" (OuterVolumeSpecName: "kube-api-access-6ftcz") pod "b28dc3f9-47a1-436c-865c-4d98e6ba960c" (UID: "b28dc3f9-47a1-436c-865c-4d98e6ba960c"). InnerVolumeSpecName "kube-api-access-6ftcz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.204563 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ftcz\" (UniqueName: \"kubernetes.io/projected/b28dc3f9-47a1-436c-865c-4d98e6ba960c-kube-api-access-6ftcz\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.204602 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b28dc3f9-47a1-436c-865c-4d98e6ba960c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.251927 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-vzxr2"] Mar 09 13:40:08 crc kubenswrapper[4764]: E0309 13:40:08.252347 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b28dc3f9-47a1-436c-865c-4d98e6ba960c" containerName="mariadb-account-create-update" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.252374 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b28dc3f9-47a1-436c-865c-4d98e6ba960c" containerName="mariadb-account-create-update" Mar 09 13:40:08 crc kubenswrapper[4764]: E0309 13:40:08.252423 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16623a65-1bef-4faa-a891-bae0a7d04977" containerName="oc" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.252433 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="16623a65-1bef-4faa-a891-bae0a7d04977" containerName="oc" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.252630 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="16623a65-1bef-4faa-a891-bae0a7d04977" containerName="oc" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.252680 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b28dc3f9-47a1-436c-865c-4d98e6ba960c" containerName="mariadb-account-create-update" Mar 09 13:40:08 crc 
kubenswrapper[4764]: I0309 13:40:08.253339 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-vzxr2" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.255556 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-8p7c7" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.256637 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.268334 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-vzxr2"] Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.409353 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-db-sync-config-data\") pod \"glance-db-sync-vzxr2\" (UID: \"29e20119-f7d3-4b10-82c3-afbfa462c831\") " pod="openstack/glance-db-sync-vzxr2" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.409417 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-config-data\") pod \"glance-db-sync-vzxr2\" (UID: \"29e20119-f7d3-4b10-82c3-afbfa462c831\") " pod="openstack/glance-db-sync-vzxr2" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.409612 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-combined-ca-bundle\") pod \"glance-db-sync-vzxr2\" (UID: \"29e20119-f7d3-4b10-82c3-afbfa462c831\") " pod="openstack/glance-db-sync-vzxr2" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.409682 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2pflc\" (UniqueName: \"kubernetes.io/projected/29e20119-f7d3-4b10-82c3-afbfa462c831-kube-api-access-2pflc\") pod \"glance-db-sync-vzxr2\" (UID: \"29e20119-f7d3-4b10-82c3-afbfa462c831\") " pod="openstack/glance-db-sync-vzxr2" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.511393 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-combined-ca-bundle\") pod \"glance-db-sync-vzxr2\" (UID: \"29e20119-f7d3-4b10-82c3-afbfa462c831\") " pod="openstack/glance-db-sync-vzxr2" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.511445 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pflc\" (UniqueName: \"kubernetes.io/projected/29e20119-f7d3-4b10-82c3-afbfa462c831-kube-api-access-2pflc\") pod \"glance-db-sync-vzxr2\" (UID: \"29e20119-f7d3-4b10-82c3-afbfa462c831\") " pod="openstack/glance-db-sync-vzxr2" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.511515 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-db-sync-config-data\") pod \"glance-db-sync-vzxr2\" (UID: \"29e20119-f7d3-4b10-82c3-afbfa462c831\") " pod="openstack/glance-db-sync-vzxr2" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.511543 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-config-data\") pod \"glance-db-sync-vzxr2\" (UID: \"29e20119-f7d3-4b10-82c3-afbfa462c831\") " pod="openstack/glance-db-sync-vzxr2" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.516157 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-config-data\") pod 
\"glance-db-sync-vzxr2\" (UID: \"29e20119-f7d3-4b10-82c3-afbfa462c831\") " pod="openstack/glance-db-sync-vzxr2" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.517052 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-db-sync-config-data\") pod \"glance-db-sync-vzxr2\" (UID: \"29e20119-f7d3-4b10-82c3-afbfa462c831\") " pod="openstack/glance-db-sync-vzxr2" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.518559 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-combined-ca-bundle\") pod \"glance-db-sync-vzxr2\" (UID: \"29e20119-f7d3-4b10-82c3-afbfa462c831\") " pod="openstack/glance-db-sync-vzxr2" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.529325 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pflc\" (UniqueName: \"kubernetes.io/projected/29e20119-f7d3-4b10-82c3-afbfa462c831-kube-api-access-2pflc\") pod \"glance-db-sync-vzxr2\" (UID: \"29e20119-f7d3-4b10-82c3-afbfa462c831\") " pod="openstack/glance-db-sync-vzxr2" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.573020 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-vzxr2" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.650831 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rvfs9" event={"ID":"b28dc3f9-47a1-436c-865c-4d98e6ba960c","Type":"ContainerDied","Data":"c8e1b592a881fc9ae62941eafc71a37253c86e963d35cd07e1fd548914aa0be6"} Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.650866 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8e1b592a881fc9ae62941eafc71a37253c86e963d35cd07e1fd548914aa0be6" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.650926 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-rvfs9" Mar 09 13:40:09 crc kubenswrapper[4764]: I0309 13:40:09.151392 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-vzxr2"] Mar 09 13:40:09 crc kubenswrapper[4764]: I0309 13:40:09.658612 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vzxr2" event={"ID":"29e20119-f7d3-4b10-82c3-afbfa462c831","Type":"ContainerStarted","Data":"bf52819fae166b95cbd52539da218f51f13c6084d42cea34252b0b03900eaee5"} Mar 09 13:40:10 crc kubenswrapper[4764]: I0309 13:40:10.942111 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-rvfs9"] Mar 09 13:40:10 crc kubenswrapper[4764]: I0309 13:40:10.948802 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-rvfs9"] Mar 09 13:40:11 crc kubenswrapper[4764]: I0309 13:40:11.571400 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b28dc3f9-47a1-436c-865c-4d98e6ba960c" path="/var/lib/kubelet/pods/b28dc3f9-47a1-436c-865c-4d98e6ba960c/volumes" Mar 09 13:40:12 crc kubenswrapper[4764]: I0309 13:40:12.685829 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="11c8bf9f-a031-4e56-b1d7-49b407eabaf7" containerID="fd0a79b758702c401b2bbc87884a5f0ad37053c07ad4da0c2c69610d9e0509c2" exitCode=0 Mar 09 13:40:12 crc kubenswrapper[4764]: I0309 13:40:12.685910 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"11c8bf9f-a031-4e56-b1d7-49b407eabaf7","Type":"ContainerDied","Data":"fd0a79b758702c401b2bbc87884a5f0ad37053c07ad4da0c2c69610d9e0509c2"} Mar 09 13:40:12 crc kubenswrapper[4764]: I0309 13:40:12.692073 4764 generic.go:334] "Generic (PLEG): container finished" podID="507bcef1-e9ef-4eb1-85ce-358209b944bc" containerID="f2c108c98ecfd46a140cf2428c8bd7a9e4bb88eab84b2da0f3783d087eb2e601" exitCode=0 Mar 09 13:40:12 crc kubenswrapper[4764]: I0309 13:40:12.692158 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"507bcef1-e9ef-4eb1-85ce-358209b944bc","Type":"ContainerDied","Data":"f2c108c98ecfd46a140cf2428c8bd7a9e4bb88eab84b2da0f3783d087eb2e601"} Mar 09 13:40:13 crc kubenswrapper[4764]: I0309 13:40:13.153995 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 09 13:40:13 crc kubenswrapper[4764]: I0309 13:40:13.658595 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-qm7vs" podUID="9bbe03cf-76d5-440a-903f-50c382aa3a4e" containerName="ovn-controller" probeResult="failure" output=< Mar 09 13:40:13 crc kubenswrapper[4764]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 09 13:40:13 crc kubenswrapper[4764]: > Mar 09 13:40:13 crc kubenswrapper[4764]: I0309 13:40:13.666686 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:40:13 crc kubenswrapper[4764]: I0309 13:40:13.680926 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:40:13 crc 
kubenswrapper[4764]: I0309 13:40:13.704688 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"11c8bf9f-a031-4e56-b1d7-49b407eabaf7","Type":"ContainerStarted","Data":"c051454f6de8558a84ceb371da4edc9fd851c2e829bf18eb804c0c4e657ef0ee"} Mar 09 13:40:13 crc kubenswrapper[4764]: I0309 13:40:13.705959 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:40:13 crc kubenswrapper[4764]: I0309 13:40:13.715547 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"507bcef1-e9ef-4eb1-85ce-358209b944bc","Type":"ContainerStarted","Data":"bed2e6d72dc23d14fbbd197ac01731271bf14e49fb44cf788ecc2cd8bf1b2828"} Mar 09 13:40:13 crc kubenswrapper[4764]: I0309 13:40:13.716074 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 09 13:40:13 crc kubenswrapper[4764]: I0309 13:40:13.734673 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=53.050281617 podStartE2EDuration="1m0.734628852s" podCreationTimestamp="2026-03-09 13:39:13 +0000 UTC" firstStartedPulling="2026-03-09 13:39:30.053943006 +0000 UTC m=+1125.304114924" lastFinishedPulling="2026-03-09 13:39:37.738290251 +0000 UTC m=+1132.988462159" observedRunningTime="2026-03-09 13:40:13.733423032 +0000 UTC m=+1168.983594970" watchObservedRunningTime="2026-03-09 13:40:13.734628852 +0000 UTC m=+1168.984800770" Mar 09 13:40:13 crc kubenswrapper[4764]: I0309 13:40:13.770334 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=52.988453527 podStartE2EDuration="1m0.770306306s" podCreationTimestamp="2026-03-09 13:39:13 +0000 UTC" firstStartedPulling="2026-03-09 13:39:30.259112606 +0000 UTC m=+1125.509284514" lastFinishedPulling="2026-03-09 13:39:38.040965385 +0000 UTC 
m=+1133.291137293" observedRunningTime="2026-03-09 13:40:13.764637344 +0000 UTC m=+1169.014809272" watchObservedRunningTime="2026-03-09 13:40:13.770306306 +0000 UTC m=+1169.020478214" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.030107 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-qm7vs-config-pddrl"] Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.031450 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.034253 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.056608 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qm7vs-config-pddrl"] Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.142991 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-run\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.143087 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-additional-scripts\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.143125 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgwlt\" (UniqueName: 
\"kubernetes.io/projected/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-kube-api-access-hgwlt\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.143181 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-scripts\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.143260 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-log-ovn\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.143282 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-run-ovn\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.244802 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-scripts\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.244918 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-log-ovn\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.244944 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-run-ovn\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.244997 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-run\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.245045 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-additional-scripts\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.245077 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgwlt\" (UniqueName: \"kubernetes.io/projected/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-kube-api-access-hgwlt\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.247899 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-scripts\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.247995 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-run\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.248006 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-run-ovn\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.248130 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-log-ovn\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.248481 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-additional-scripts\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.274184 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgwlt\" (UniqueName: 
\"kubernetes.io/projected/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-kube-api-access-hgwlt\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.362728 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.829399 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qm7vs-config-pddrl"] Mar 09 13:40:14 crc kubenswrapper[4764]: W0309 13:40:14.839475 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3a4b4d4_cc62_4a80_91f9_fa0c2e98292c.slice/crio-222c901f3647541831b70903ff46ece77edb02628f24984141b48907d046ac11 WatchSource:0}: Error finding container 222c901f3647541831b70903ff46ece77edb02628f24984141b48907d046ac11: Status 404 returned error can't find the container with id 222c901f3647541831b70903ff46ece77edb02628f24984141b48907d046ac11 Mar 09 13:40:15 crc kubenswrapper[4764]: I0309 13:40:15.736138 4764 generic.go:334] "Generic (PLEG): container finished" podID="d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c" containerID="f865dbb42b23c282654ed11e8388916b4f7e6868644331082fbaed39e3cbb723" exitCode=0 Mar 09 13:40:15 crc kubenswrapper[4764]: I0309 13:40:15.736734 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qm7vs-config-pddrl" event={"ID":"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c","Type":"ContainerDied","Data":"f865dbb42b23c282654ed11e8388916b4f7e6868644331082fbaed39e3cbb723"} Mar 09 13:40:15 crc kubenswrapper[4764]: I0309 13:40:15.736769 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qm7vs-config-pddrl" 
event={"ID":"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c","Type":"ContainerStarted","Data":"222c901f3647541831b70903ff46ece77edb02628f24984141b48907d046ac11"} Mar 09 13:40:15 crc kubenswrapper[4764]: I0309 13:40:15.962537 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-htpf8"] Mar 09 13:40:15 crc kubenswrapper[4764]: I0309 13:40:15.965244 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-htpf8" Mar 09 13:40:15 crc kubenswrapper[4764]: I0309 13:40:15.967795 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-htpf8"] Mar 09 13:40:15 crc kubenswrapper[4764]: I0309 13:40:15.970499 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 09 13:40:16 crc kubenswrapper[4764]: I0309 13:40:16.080249 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w89w5\" (UniqueName: \"kubernetes.io/projected/88e0f4c9-1553-4aca-83f2-e0461ddf062b-kube-api-access-w89w5\") pod \"root-account-create-update-htpf8\" (UID: \"88e0f4c9-1553-4aca-83f2-e0461ddf062b\") " pod="openstack/root-account-create-update-htpf8" Mar 09 13:40:16 crc kubenswrapper[4764]: I0309 13:40:16.080375 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88e0f4c9-1553-4aca-83f2-e0461ddf062b-operator-scripts\") pod \"root-account-create-update-htpf8\" (UID: \"88e0f4c9-1553-4aca-83f2-e0461ddf062b\") " pod="openstack/root-account-create-update-htpf8" Mar 09 13:40:16 crc kubenswrapper[4764]: I0309 13:40:16.183885 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w89w5\" (UniqueName: \"kubernetes.io/projected/88e0f4c9-1553-4aca-83f2-e0461ddf062b-kube-api-access-w89w5\") pod 
\"root-account-create-update-htpf8\" (UID: \"88e0f4c9-1553-4aca-83f2-e0461ddf062b\") " pod="openstack/root-account-create-update-htpf8" Mar 09 13:40:16 crc kubenswrapper[4764]: I0309 13:40:16.185383 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88e0f4c9-1553-4aca-83f2-e0461ddf062b-operator-scripts\") pod \"root-account-create-update-htpf8\" (UID: \"88e0f4c9-1553-4aca-83f2-e0461ddf062b\") " pod="openstack/root-account-create-update-htpf8" Mar 09 13:40:16 crc kubenswrapper[4764]: I0309 13:40:16.186452 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88e0f4c9-1553-4aca-83f2-e0461ddf062b-operator-scripts\") pod \"root-account-create-update-htpf8\" (UID: \"88e0f4c9-1553-4aca-83f2-e0461ddf062b\") " pod="openstack/root-account-create-update-htpf8" Mar 09 13:40:16 crc kubenswrapper[4764]: I0309 13:40:16.224295 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w89w5\" (UniqueName: \"kubernetes.io/projected/88e0f4c9-1553-4aca-83f2-e0461ddf062b-kube-api-access-w89w5\") pod \"root-account-create-update-htpf8\" (UID: \"88e0f4c9-1553-4aca-83f2-e0461ddf062b\") " pod="openstack/root-account-create-update-htpf8" Mar 09 13:40:16 crc kubenswrapper[4764]: I0309 13:40:16.289584 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-htpf8" Mar 09 13:40:18 crc kubenswrapper[4764]: I0309 13:40:18.661360 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-qm7vs" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.117212 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.301082 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgwlt\" (UniqueName: \"kubernetes.io/projected/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-kube-api-access-hgwlt\") pod \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.301505 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-scripts\") pod \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.301638 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-additional-scripts\") pod \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.301711 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-run-ovn\") pod \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.301775 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-run\") pod \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.301843 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" 
(UniqueName: \"kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-log-ovn\") pod \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.301923 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c" (UID: "d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.301997 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-run" (OuterVolumeSpecName: "var-run") pod "d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c" (UID: "d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.302077 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c" (UID: "d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.302716 4764 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.302751 4764 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.302764 4764 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-run\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.303401 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c" (UID: "d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.304072 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-scripts" (OuterVolumeSpecName: "scripts") pod "d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c" (UID: "d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.305922 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-kube-api-access-hgwlt" (OuterVolumeSpecName: "kube-api-access-hgwlt") pod "d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c" (UID: "d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c"). InnerVolumeSpecName "kube-api-access-hgwlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.404952 4764 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.405000 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgwlt\" (UniqueName: \"kubernetes.io/projected/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-kube-api-access-hgwlt\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.405015 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.536971 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-htpf8"] Mar 09 13:40:22 crc kubenswrapper[4764]: W0309 13:40:22.550445 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88e0f4c9_1553_4aca_83f2_e0461ddf062b.slice/crio-26051806488a9ea5ced1bc0738712a4f97689b2959961164a42a4d12b30ca188 WatchSource:0}: Error finding container 26051806488a9ea5ced1bc0738712a4f97689b2959961164a42a4d12b30ca188: Status 404 returned error can't find the container with id 
26051806488a9ea5ced1bc0738712a4f97689b2959961164a42a4d12b30ca188 Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.797915 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vzxr2" event={"ID":"29e20119-f7d3-4b10-82c3-afbfa462c831","Type":"ContainerStarted","Data":"06f1bf05b0fa436ae59020308ffb3c9a4a73f8d30f844f8ee3828c34f9dbc702"} Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.800948 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.800998 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qm7vs-config-pddrl" event={"ID":"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c","Type":"ContainerDied","Data":"222c901f3647541831b70903ff46ece77edb02628f24984141b48907d046ac11"} Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.801050 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="222c901f3647541831b70903ff46ece77edb02628f24984141b48907d046ac11" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.804051 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-htpf8" event={"ID":"88e0f4c9-1553-4aca-83f2-e0461ddf062b","Type":"ContainerStarted","Data":"ccd695e4102eac8ad1bb7aec25a2c1252558ea5824f39855b788c7a3324a1c3b"} Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.804094 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-htpf8" event={"ID":"88e0f4c9-1553-4aca-83f2-e0461ddf062b","Type":"ContainerStarted","Data":"26051806488a9ea5ced1bc0738712a4f97689b2959961164a42a4d12b30ca188"} Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.822120 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-vzxr2" podStartSLOduration=1.8062364720000001 podStartE2EDuration="14.822093359s" 
podCreationTimestamp="2026-03-09 13:40:08 +0000 UTC" firstStartedPulling="2026-03-09 13:40:09.155374838 +0000 UTC m=+1164.405546746" lastFinishedPulling="2026-03-09 13:40:22.171231725 +0000 UTC m=+1177.421403633" observedRunningTime="2026-03-09 13:40:22.819462883 +0000 UTC m=+1178.069634791" watchObservedRunningTime="2026-03-09 13:40:22.822093359 +0000 UTC m=+1178.072265267" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.847062 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-htpf8" podStartSLOduration=7.847032414 podStartE2EDuration="7.847032414s" podCreationTimestamp="2026-03-09 13:40:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:40:22.840761967 +0000 UTC m=+1178.090933895" watchObservedRunningTime="2026-03-09 13:40:22.847032414 +0000 UTC m=+1178.097204342" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.251958 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-qm7vs-config-pddrl"] Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.260093 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-qm7vs-config-pddrl"] Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.355119 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-qm7vs-config-xgh6g"] Mar 09 13:40:23 crc kubenswrapper[4764]: E0309 13:40:23.355563 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c" containerName="ovn-config" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.355589 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c" containerName="ovn-config" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.355812 4764 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c" containerName="ovn-config" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.356577 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.359265 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.374514 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qm7vs-config-xgh6g"] Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.423390 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/186bfdab-9518-4d38-9f43-a0eafa335ed9-additional-scripts\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.423445 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-879pf\" (UniqueName: \"kubernetes.io/projected/186bfdab-9518-4d38-9f43-a0eafa335ed9-kube-api-access-879pf\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.423667 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-run-ovn\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.423786 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/186bfdab-9518-4d38-9f43-a0eafa335ed9-scripts\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.423922 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-run\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.424044 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-log-ovn\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.525730 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/186bfdab-9518-4d38-9f43-a0eafa335ed9-scripts\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.528402 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-run\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.528473 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-log-ovn\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.528549 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/186bfdab-9518-4d38-9f43-a0eafa335ed9-additional-scripts\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.528572 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-879pf\" (UniqueName: \"kubernetes.io/projected/186bfdab-9518-4d38-9f43-a0eafa335ed9-kube-api-access-879pf\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.528741 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-run\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.528775 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-run-ovn\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.528870 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-log-ovn\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.528959 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-run-ovn\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.529201 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/186bfdab-9518-4d38-9f43-a0eafa335ed9-scripts\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.529480 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/186bfdab-9518-4d38-9f43-a0eafa335ed9-additional-scripts\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.555072 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-879pf\" (UniqueName: \"kubernetes.io/projected/186bfdab-9518-4d38-9f43-a0eafa335ed9-kube-api-access-879pf\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.572493 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c" path="/var/lib/kubelet/pods/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c/volumes" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.676454 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.844215 4764 generic.go:334] "Generic (PLEG): container finished" podID="88e0f4c9-1553-4aca-83f2-e0461ddf062b" containerID="ccd695e4102eac8ad1bb7aec25a2c1252558ea5824f39855b788c7a3324a1c3b" exitCode=0 Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.845041 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-htpf8" event={"ID":"88e0f4c9-1553-4aca-83f2-e0461ddf062b","Type":"ContainerDied","Data":"ccd695e4102eac8ad1bb7aec25a2c1252558ea5824f39855b788c7a3324a1c3b"} Mar 09 13:40:24 crc kubenswrapper[4764]: W0309 13:40:24.197479 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod186bfdab_9518_4d38_9f43_a0eafa335ed9.slice/crio-182ebbef535a87278100e77191a38749232657d5601f48b86924346516887649 WatchSource:0}: Error finding container 182ebbef535a87278100e77191a38749232657d5601f48b86924346516887649: Status 404 returned error can't find the container with id 182ebbef535a87278100e77191a38749232657d5601f48b86924346516887649 Mar 09 13:40:24 crc kubenswrapper[4764]: I0309 13:40:24.198613 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qm7vs-config-xgh6g"] Mar 09 13:40:24 crc kubenswrapper[4764]: I0309 13:40:24.831963 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:40:24 crc kubenswrapper[4764]: I0309 13:40:24.857269 4764 generic.go:334] "Generic (PLEG): container finished" podID="186bfdab-9518-4d38-9f43-a0eafa335ed9" 
containerID="8959c8a0509c3231f91de89d62e7a7d8e52e391a4941e4498d6d53a3afa1efea" exitCode=0 Mar 09 13:40:24 crc kubenswrapper[4764]: I0309 13:40:24.857593 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qm7vs-config-xgh6g" event={"ID":"186bfdab-9518-4d38-9f43-a0eafa335ed9","Type":"ContainerDied","Data":"8959c8a0509c3231f91de89d62e7a7d8e52e391a4941e4498d6d53a3afa1efea"} Mar 09 13:40:24 crc kubenswrapper[4764]: I0309 13:40:24.857692 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qm7vs-config-xgh6g" event={"ID":"186bfdab-9518-4d38-9f43-a0eafa335ed9","Type":"ContainerStarted","Data":"182ebbef535a87278100e77191a38749232657d5601f48b86924346516887649"} Mar 09 13:40:24 crc kubenswrapper[4764]: I0309 13:40:24.909938 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 09 13:40:25 crc kubenswrapper[4764]: I0309 13:40:25.178042 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-htpf8" Mar 09 13:40:25 crc kubenswrapper[4764]: I0309 13:40:25.270214 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88e0f4c9-1553-4aca-83f2-e0461ddf062b-operator-scripts\") pod \"88e0f4c9-1553-4aca-83f2-e0461ddf062b\" (UID: \"88e0f4c9-1553-4aca-83f2-e0461ddf062b\") " Mar 09 13:40:25 crc kubenswrapper[4764]: I0309 13:40:25.270375 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w89w5\" (UniqueName: \"kubernetes.io/projected/88e0f4c9-1553-4aca-83f2-e0461ddf062b-kube-api-access-w89w5\") pod \"88e0f4c9-1553-4aca-83f2-e0461ddf062b\" (UID: \"88e0f4c9-1553-4aca-83f2-e0461ddf062b\") " Mar 09 13:40:25 crc kubenswrapper[4764]: I0309 13:40:25.272038 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88e0f4c9-1553-4aca-83f2-e0461ddf062b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "88e0f4c9-1553-4aca-83f2-e0461ddf062b" (UID: "88e0f4c9-1553-4aca-83f2-e0461ddf062b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:25 crc kubenswrapper[4764]: I0309 13:40:25.282490 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88e0f4c9-1553-4aca-83f2-e0461ddf062b-kube-api-access-w89w5" (OuterVolumeSpecName: "kube-api-access-w89w5") pod "88e0f4c9-1553-4aca-83f2-e0461ddf062b" (UID: "88e0f4c9-1553-4aca-83f2-e0461ddf062b"). InnerVolumeSpecName "kube-api-access-w89w5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:25 crc kubenswrapper[4764]: I0309 13:40:25.372478 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w89w5\" (UniqueName: \"kubernetes.io/projected/88e0f4c9-1553-4aca-83f2-e0461ddf062b-kube-api-access-w89w5\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:25 crc kubenswrapper[4764]: I0309 13:40:25.372523 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88e0f4c9-1553-4aca-83f2-e0461ddf062b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:25 crc kubenswrapper[4764]: I0309 13:40:25.871021 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-htpf8" event={"ID":"88e0f4c9-1553-4aca-83f2-e0461ddf062b","Type":"ContainerDied","Data":"26051806488a9ea5ced1bc0738712a4f97689b2959961164a42a4d12b30ca188"} Mar 09 13:40:25 crc kubenswrapper[4764]: I0309 13:40:25.871105 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26051806488a9ea5ced1bc0738712a4f97689b2959961164a42a4d12b30ca188" Mar 09 13:40:25 crc kubenswrapper[4764]: I0309 13:40:25.871036 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-htpf8" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.209851 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.395614 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/186bfdab-9518-4d38-9f43-a0eafa335ed9-additional-scripts\") pod \"186bfdab-9518-4d38-9f43-a0eafa335ed9\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.395835 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-run\") pod \"186bfdab-9518-4d38-9f43-a0eafa335ed9\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.395911 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-log-ovn\") pod \"186bfdab-9518-4d38-9f43-a0eafa335ed9\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.395944 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-879pf\" (UniqueName: \"kubernetes.io/projected/186bfdab-9518-4d38-9f43-a0eafa335ed9-kube-api-access-879pf\") pod \"186bfdab-9518-4d38-9f43-a0eafa335ed9\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.396019 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/186bfdab-9518-4d38-9f43-a0eafa335ed9-scripts\") pod \"186bfdab-9518-4d38-9f43-a0eafa335ed9\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.396041 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "186bfdab-9518-4d38-9f43-a0eafa335ed9" (UID: "186bfdab-9518-4d38-9f43-a0eafa335ed9"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.396019 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-run" (OuterVolumeSpecName: "var-run") pod "186bfdab-9518-4d38-9f43-a0eafa335ed9" (UID: "186bfdab-9518-4d38-9f43-a0eafa335ed9"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.396128 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-run-ovn\") pod \"186bfdab-9518-4d38-9f43-a0eafa335ed9\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.396290 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "186bfdab-9518-4d38-9f43-a0eafa335ed9" (UID: "186bfdab-9518-4d38-9f43-a0eafa335ed9"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.396742 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/186bfdab-9518-4d38-9f43-a0eafa335ed9-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "186bfdab-9518-4d38-9f43-a0eafa335ed9" (UID: "186bfdab-9518-4d38-9f43-a0eafa335ed9"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.396760 4764 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.396775 4764 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-run\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.396784 4764 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.397101 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/186bfdab-9518-4d38-9f43-a0eafa335ed9-scripts" (OuterVolumeSpecName: "scripts") pod "186bfdab-9518-4d38-9f43-a0eafa335ed9" (UID: "186bfdab-9518-4d38-9f43-a0eafa335ed9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.401396 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/186bfdab-9518-4d38-9f43-a0eafa335ed9-kube-api-access-879pf" (OuterVolumeSpecName: "kube-api-access-879pf") pod "186bfdab-9518-4d38-9f43-a0eafa335ed9" (UID: "186bfdab-9518-4d38-9f43-a0eafa335ed9"). InnerVolumeSpecName "kube-api-access-879pf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.498833 4764 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/186bfdab-9518-4d38-9f43-a0eafa335ed9-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.498880 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-879pf\" (UniqueName: \"kubernetes.io/projected/186bfdab-9518-4d38-9f43-a0eafa335ed9-kube-api-access-879pf\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.498894 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/186bfdab-9518-4d38-9f43-a0eafa335ed9-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.810323 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-mqv59"] Mar 09 13:40:26 crc kubenswrapper[4764]: E0309 13:40:26.810792 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88e0f4c9-1553-4aca-83f2-e0461ddf062b" containerName="mariadb-account-create-update" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.810818 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="88e0f4c9-1553-4aca-83f2-e0461ddf062b" containerName="mariadb-account-create-update" Mar 09 13:40:26 crc kubenswrapper[4764]: E0309 13:40:26.810848 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="186bfdab-9518-4d38-9f43-a0eafa335ed9" containerName="ovn-config" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.810856 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="186bfdab-9518-4d38-9f43-a0eafa335ed9" containerName="ovn-config" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.811056 4764 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="186bfdab-9518-4d38-9f43-a0eafa335ed9" containerName="ovn-config" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.811081 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="88e0f4c9-1553-4aca-83f2-e0461ddf062b" containerName="mariadb-account-create-update" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.811820 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-mqv59" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.817846 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-mqv59"] Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.881600 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qm7vs-config-xgh6g" event={"ID":"186bfdab-9518-4d38-9f43-a0eafa335ed9","Type":"ContainerDied","Data":"182ebbef535a87278100e77191a38749232657d5601f48b86924346516887649"} Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.881662 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="182ebbef535a87278100e77191a38749232657d5601f48b86924346516887649" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.881686 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.906071 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46124175-b282-444f-8d9c-0397e35cf8ae-operator-scripts\") pod \"cinder-db-create-mqv59\" (UID: \"46124175-b282-444f-8d9c-0397e35cf8ae\") " pod="openstack/cinder-db-create-mqv59" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.906195 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zqq7\" (UniqueName: \"kubernetes.io/projected/46124175-b282-444f-8d9c-0397e35cf8ae-kube-api-access-6zqq7\") pod \"cinder-db-create-mqv59\" (UID: \"46124175-b282-444f-8d9c-0397e35cf8ae\") " pod="openstack/cinder-db-create-mqv59" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.942123 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-15a9-account-create-update-5s8tj"] Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.943296 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-15a9-account-create-update-5s8tj" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.945713 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.958288 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-15a9-account-create-update-5s8tj"] Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.003704 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-thhcb"] Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.005048 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-thhcb" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.010355 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46124175-b282-444f-8d9c-0397e35cf8ae-operator-scripts\") pod \"cinder-db-create-mqv59\" (UID: \"46124175-b282-444f-8d9c-0397e35cf8ae\") " pod="openstack/cinder-db-create-mqv59" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.010438 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zqq7\" (UniqueName: \"kubernetes.io/projected/46124175-b282-444f-8d9c-0397e35cf8ae-kube-api-access-6zqq7\") pod \"cinder-db-create-mqv59\" (UID: \"46124175-b282-444f-8d9c-0397e35cf8ae\") " pod="openstack/cinder-db-create-mqv59" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.011605 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46124175-b282-444f-8d9c-0397e35cf8ae-operator-scripts\") pod \"cinder-db-create-mqv59\" (UID: \"46124175-b282-444f-8d9c-0397e35cf8ae\") " pod="openstack/cinder-db-create-mqv59" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.016113 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-thhcb"] Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.038633 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zqq7\" (UniqueName: \"kubernetes.io/projected/46124175-b282-444f-8d9c-0397e35cf8ae-kube-api-access-6zqq7\") pod \"cinder-db-create-mqv59\" (UID: \"46124175-b282-444f-8d9c-0397e35cf8ae\") " pod="openstack/cinder-db-create-mqv59" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.112779 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppxms\" (UniqueName: 
\"kubernetes.io/projected/642b5df5-dec0-47cc-9595-02b254277452-kube-api-access-ppxms\") pod \"barbican-db-create-thhcb\" (UID: \"642b5df5-dec0-47cc-9595-02b254277452\") " pod="openstack/barbican-db-create-thhcb" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.113205 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82410bc0-aa4c-450d-8fbc-67cfb9dd615b-operator-scripts\") pod \"cinder-15a9-account-create-update-5s8tj\" (UID: \"82410bc0-aa4c-450d-8fbc-67cfb9dd615b\") " pod="openstack/cinder-15a9-account-create-update-5s8tj" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.113261 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbcrf\" (UniqueName: \"kubernetes.io/projected/82410bc0-aa4c-450d-8fbc-67cfb9dd615b-kube-api-access-sbcrf\") pod \"cinder-15a9-account-create-update-5s8tj\" (UID: \"82410bc0-aa4c-450d-8fbc-67cfb9dd615b\") " pod="openstack/cinder-15a9-account-create-update-5s8tj" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.113297 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/642b5df5-dec0-47cc-9595-02b254277452-operator-scripts\") pod \"barbican-db-create-thhcb\" (UID: \"642b5df5-dec0-47cc-9595-02b254277452\") " pod="openstack/barbican-db-create-thhcb" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.133359 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-mqv59" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.176980 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-rjx7v"] Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.178396 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-rjx7v" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.182290 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.182412 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.182679 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4hfql" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.183406 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.191848 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-rjx7v"] Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.201731 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-gkf9g"] Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.202755 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-gkf9g" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.219051 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/642b5df5-dec0-47cc-9595-02b254277452-operator-scripts\") pod \"barbican-db-create-thhcb\" (UID: \"642b5df5-dec0-47cc-9595-02b254277452\") " pod="openstack/barbican-db-create-thhcb" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.219158 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppxms\" (UniqueName: \"kubernetes.io/projected/642b5df5-dec0-47cc-9595-02b254277452-kube-api-access-ppxms\") pod \"barbican-db-create-thhcb\" (UID: \"642b5df5-dec0-47cc-9595-02b254277452\") " pod="openstack/barbican-db-create-thhcb" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.219211 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82410bc0-aa4c-450d-8fbc-67cfb9dd615b-operator-scripts\") pod \"cinder-15a9-account-create-update-5s8tj\" (UID: \"82410bc0-aa4c-450d-8fbc-67cfb9dd615b\") " pod="openstack/cinder-15a9-account-create-update-5s8tj" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.219250 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbcrf\" (UniqueName: \"kubernetes.io/projected/82410bc0-aa4c-450d-8fbc-67cfb9dd615b-kube-api-access-sbcrf\") pod \"cinder-15a9-account-create-update-5s8tj\" (UID: \"82410bc0-aa4c-450d-8fbc-67cfb9dd615b\") " pod="openstack/cinder-15a9-account-create-update-5s8tj" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.220332 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82410bc0-aa4c-450d-8fbc-67cfb9dd615b-operator-scripts\") pod \"cinder-15a9-account-create-update-5s8tj\" (UID: 
\"82410bc0-aa4c-450d-8fbc-67cfb9dd615b\") " pod="openstack/cinder-15a9-account-create-update-5s8tj" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.220558 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/642b5df5-dec0-47cc-9595-02b254277452-operator-scripts\") pod \"barbican-db-create-thhcb\" (UID: \"642b5df5-dec0-47cc-9595-02b254277452\") " pod="openstack/barbican-db-create-thhcb" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.233448 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-6d6f-account-create-update-qxm8j"] Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.234588 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6d6f-account-create-update-qxm8j" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.240790 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.249276 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbcrf\" (UniqueName: \"kubernetes.io/projected/82410bc0-aa4c-450d-8fbc-67cfb9dd615b-kube-api-access-sbcrf\") pod \"cinder-15a9-account-create-update-5s8tj\" (UID: \"82410bc0-aa4c-450d-8fbc-67cfb9dd615b\") " pod="openstack/cinder-15a9-account-create-update-5s8tj" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.254422 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-gkf9g"] Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.259989 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppxms\" (UniqueName: \"kubernetes.io/projected/642b5df5-dec0-47cc-9595-02b254277452-kube-api-access-ppxms\") pod \"barbican-db-create-thhcb\" (UID: \"642b5df5-dec0-47cc-9595-02b254277452\") " pod="openstack/barbican-db-create-thhcb" Mar 09 13:40:27 crc 
kubenswrapper[4764]: I0309 13:40:27.276984 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-15a9-account-create-update-5s8tj" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.286721 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6d6f-account-create-update-qxm8j"] Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.326080 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-thhcb" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.327804 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bf72fda-56e5-427c-b2d0-8267613d8a9e-operator-scripts\") pod \"neutron-db-create-gkf9g\" (UID: \"1bf72fda-56e5-427c-b2d0-8267613d8a9e\") " pod="openstack/neutron-db-create-gkf9g" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.327867 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw7zq\" (UniqueName: \"kubernetes.io/projected/1bf72fda-56e5-427c-b2d0-8267613d8a9e-kube-api-access-gw7zq\") pod \"neutron-db-create-gkf9g\" (UID: \"1bf72fda-56e5-427c-b2d0-8267613d8a9e\") " pod="openstack/neutron-db-create-gkf9g" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.328097 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea9388c2-526b-49ff-8f42-03ca66ae08dd-config-data\") pod \"keystone-db-sync-rjx7v\" (UID: \"ea9388c2-526b-49ff-8f42-03ca66ae08dd\") " pod="openstack/keystone-db-sync-rjx7v" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.328120 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ea9388c2-526b-49ff-8f42-03ca66ae08dd-combined-ca-bundle\") pod \"keystone-db-sync-rjx7v\" (UID: \"ea9388c2-526b-49ff-8f42-03ca66ae08dd\") " pod="openstack/keystone-db-sync-rjx7v" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.328165 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8bhv\" (UniqueName: \"kubernetes.io/projected/ea9388c2-526b-49ff-8f42-03ca66ae08dd-kube-api-access-q8bhv\") pod \"keystone-db-sync-rjx7v\" (UID: \"ea9388c2-526b-49ff-8f42-03ca66ae08dd\") " pod="openstack/keystone-db-sync-rjx7v" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.376847 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-a166-account-create-update-kswwc"] Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.378337 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-a166-account-create-update-kswwc" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.384328 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-qm7vs-config-xgh6g"] Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.395164 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.395228 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-qm7vs-config-xgh6g"] Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.395799 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-a166-account-create-update-kswwc"] Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.429891 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea9388c2-526b-49ff-8f42-03ca66ae08dd-config-data\") pod \"keystone-db-sync-rjx7v\" (UID: \"ea9388c2-526b-49ff-8f42-03ca66ae08dd\") " 
pod="openstack/keystone-db-sync-rjx7v" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.430307 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea9388c2-526b-49ff-8f42-03ca66ae08dd-combined-ca-bundle\") pod \"keystone-db-sync-rjx7v\" (UID: \"ea9388c2-526b-49ff-8f42-03ca66ae08dd\") " pod="openstack/keystone-db-sync-rjx7v" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.430354 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8bhv\" (UniqueName: \"kubernetes.io/projected/ea9388c2-526b-49ff-8f42-03ca66ae08dd-kube-api-access-q8bhv\") pod \"keystone-db-sync-rjx7v\" (UID: \"ea9388c2-526b-49ff-8f42-03ca66ae08dd\") " pod="openstack/keystone-db-sync-rjx7v" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.430399 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b86f9b8-6493-4a60-85b3-12057a6a8f65-operator-scripts\") pod \"barbican-6d6f-account-create-update-qxm8j\" (UID: \"5b86f9b8-6493-4a60-85b3-12057a6a8f65\") " pod="openstack/barbican-6d6f-account-create-update-qxm8j" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.430433 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsb9v\" (UniqueName: \"kubernetes.io/projected/5b86f9b8-6493-4a60-85b3-12057a6a8f65-kube-api-access-wsb9v\") pod \"barbican-6d6f-account-create-update-qxm8j\" (UID: \"5b86f9b8-6493-4a60-85b3-12057a6a8f65\") " pod="openstack/barbican-6d6f-account-create-update-qxm8j" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.430482 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bf72fda-56e5-427c-b2d0-8267613d8a9e-operator-scripts\") pod \"neutron-db-create-gkf9g\" (UID: 
\"1bf72fda-56e5-427c-b2d0-8267613d8a9e\") " pod="openstack/neutron-db-create-gkf9g" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.430508 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw7zq\" (UniqueName: \"kubernetes.io/projected/1bf72fda-56e5-427c-b2d0-8267613d8a9e-kube-api-access-gw7zq\") pod \"neutron-db-create-gkf9g\" (UID: \"1bf72fda-56e5-427c-b2d0-8267613d8a9e\") " pod="openstack/neutron-db-create-gkf9g" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.443141 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bf72fda-56e5-427c-b2d0-8267613d8a9e-operator-scripts\") pod \"neutron-db-create-gkf9g\" (UID: \"1bf72fda-56e5-427c-b2d0-8267613d8a9e\") " pod="openstack/neutron-db-create-gkf9g" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.444670 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea9388c2-526b-49ff-8f42-03ca66ae08dd-combined-ca-bundle\") pod \"keystone-db-sync-rjx7v\" (UID: \"ea9388c2-526b-49ff-8f42-03ca66ae08dd\") " pod="openstack/keystone-db-sync-rjx7v" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.449401 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea9388c2-526b-49ff-8f42-03ca66ae08dd-config-data\") pod \"keystone-db-sync-rjx7v\" (UID: \"ea9388c2-526b-49ff-8f42-03ca66ae08dd\") " pod="openstack/keystone-db-sync-rjx7v" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.467471 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw7zq\" (UniqueName: \"kubernetes.io/projected/1bf72fda-56e5-427c-b2d0-8267613d8a9e-kube-api-access-gw7zq\") pod \"neutron-db-create-gkf9g\" (UID: \"1bf72fda-56e5-427c-b2d0-8267613d8a9e\") " pod="openstack/neutron-db-create-gkf9g" Mar 09 13:40:27 crc 
kubenswrapper[4764]: I0309 13:40:27.471183 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8bhv\" (UniqueName: \"kubernetes.io/projected/ea9388c2-526b-49ff-8f42-03ca66ae08dd-kube-api-access-q8bhv\") pod \"keystone-db-sync-rjx7v\" (UID: \"ea9388c2-526b-49ff-8f42-03ca66ae08dd\") " pod="openstack/keystone-db-sync-rjx7v" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.534518 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b86f9b8-6493-4a60-85b3-12057a6a8f65-operator-scripts\") pod \"barbican-6d6f-account-create-update-qxm8j\" (UID: \"5b86f9b8-6493-4a60-85b3-12057a6a8f65\") " pod="openstack/barbican-6d6f-account-create-update-qxm8j" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.534581 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsb9v\" (UniqueName: \"kubernetes.io/projected/5b86f9b8-6493-4a60-85b3-12057a6a8f65-kube-api-access-wsb9v\") pod \"barbican-6d6f-account-create-update-qxm8j\" (UID: \"5b86f9b8-6493-4a60-85b3-12057a6a8f65\") " pod="openstack/barbican-6d6f-account-create-update-qxm8j" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.534701 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tdsb\" (UniqueName: \"kubernetes.io/projected/ad7d32c2-ffe4-43d5-8640-6219f863bc2a-kube-api-access-6tdsb\") pod \"neutron-a166-account-create-update-kswwc\" (UID: \"ad7d32c2-ffe4-43d5-8640-6219f863bc2a\") " pod="openstack/neutron-a166-account-create-update-kswwc" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.534767 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad7d32c2-ffe4-43d5-8640-6219f863bc2a-operator-scripts\") pod \"neutron-a166-account-create-update-kswwc\" (UID: 
\"ad7d32c2-ffe4-43d5-8640-6219f863bc2a\") " pod="openstack/neutron-a166-account-create-update-kswwc" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.535661 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b86f9b8-6493-4a60-85b3-12057a6a8f65-operator-scripts\") pod \"barbican-6d6f-account-create-update-qxm8j\" (UID: \"5b86f9b8-6493-4a60-85b3-12057a6a8f65\") " pod="openstack/barbican-6d6f-account-create-update-qxm8j" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.563188 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsb9v\" (UniqueName: \"kubernetes.io/projected/5b86f9b8-6493-4a60-85b3-12057a6a8f65-kube-api-access-wsb9v\") pod \"barbican-6d6f-account-create-update-qxm8j\" (UID: \"5b86f9b8-6493-4a60-85b3-12057a6a8f65\") " pod="openstack/barbican-6d6f-account-create-update-qxm8j" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.584528 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-rjx7v" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.592950 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="186bfdab-9518-4d38-9f43-a0eafa335ed9" path="/var/lib/kubelet/pods/186bfdab-9518-4d38-9f43-a0eafa335ed9/volumes" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.605090 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6d6f-account-create-update-qxm8j" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.636760 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tdsb\" (UniqueName: \"kubernetes.io/projected/ad7d32c2-ffe4-43d5-8640-6219f863bc2a-kube-api-access-6tdsb\") pod \"neutron-a166-account-create-update-kswwc\" (UID: \"ad7d32c2-ffe4-43d5-8640-6219f863bc2a\") " pod="openstack/neutron-a166-account-create-update-kswwc" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.636866 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad7d32c2-ffe4-43d5-8640-6219f863bc2a-operator-scripts\") pod \"neutron-a166-account-create-update-kswwc\" (UID: \"ad7d32c2-ffe4-43d5-8640-6219f863bc2a\") " pod="openstack/neutron-a166-account-create-update-kswwc" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.637627 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad7d32c2-ffe4-43d5-8640-6219f863bc2a-operator-scripts\") pod \"neutron-a166-account-create-update-kswwc\" (UID: \"ad7d32c2-ffe4-43d5-8640-6219f863bc2a\") " pod="openstack/neutron-a166-account-create-update-kswwc" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.638049 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-gkf9g" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.676059 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tdsb\" (UniqueName: \"kubernetes.io/projected/ad7d32c2-ffe4-43d5-8640-6219f863bc2a-kube-api-access-6tdsb\") pod \"neutron-a166-account-create-update-kswwc\" (UID: \"ad7d32c2-ffe4-43d5-8640-6219f863bc2a\") " pod="openstack/neutron-a166-account-create-update-kswwc" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.828804 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-a166-account-create-update-kswwc" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.887336 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-15a9-account-create-update-5s8tj"] Mar 09 13:40:27 crc kubenswrapper[4764]: W0309 13:40:27.916819 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82410bc0_aa4c_450d_8fbc_67cfb9dd615b.slice/crio-0b8142b3cc805480a67a46802a0365da053e4e9868c09964599a921ca210d192 WatchSource:0}: Error finding container 0b8142b3cc805480a67a46802a0365da053e4e9868c09964599a921ca210d192: Status 404 returned error can't find the container with id 0b8142b3cc805480a67a46802a0365da053e4e9868c09964599a921ca210d192 Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.932980 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-mqv59"] Mar 09 13:40:28 crc kubenswrapper[4764]: W0309 13:40:28.222470 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod642b5df5_dec0_47cc_9595_02b254277452.slice/crio-d83672538bf04116631d7414d3375d19a02bf6520cb37838591c270b1cfadc5f WatchSource:0}: Error finding container d83672538bf04116631d7414d3375d19a02bf6520cb37838591c270b1cfadc5f: Status 404 returned error 
can't find the container with id d83672538bf04116631d7414d3375d19a02bf6520cb37838591c270b1cfadc5f Mar 09 13:40:28 crc kubenswrapper[4764]: I0309 13:40:28.223287 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-thhcb"] Mar 09 13:40:28 crc kubenswrapper[4764]: I0309 13:40:28.370513 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:40:28 crc kubenswrapper[4764]: I0309 13:40:28.370582 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:40:28 crc kubenswrapper[4764]: I0309 13:40:28.373862 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-gkf9g"] Mar 09 13:40:28 crc kubenswrapper[4764]: W0309 13:40:28.383972 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bf72fda_56e5_427c_b2d0_8267613d8a9e.slice/crio-e51ce3016afe58e70de7013784d516c067ad727cd5eede5c2da838d45dc7748c WatchSource:0}: Error finding container e51ce3016afe58e70de7013784d516c067ad727cd5eede5c2da838d45dc7748c: Status 404 returned error can't find the container with id e51ce3016afe58e70de7013784d516c067ad727cd5eede5c2da838d45dc7748c Mar 09 13:40:28 crc kubenswrapper[4764]: I0309 13:40:28.934397 4764 generic.go:334] "Generic (PLEG): container finished" podID="82410bc0-aa4c-450d-8fbc-67cfb9dd615b" containerID="339bad8557439d7afa5c329a435cd35a995246fc604b25e4a8e250f1a7b99368" exitCode=0 Mar 09 13:40:28 crc 
kubenswrapper[4764]: I0309 13:40:28.934630 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-15a9-account-create-update-5s8tj" event={"ID":"82410bc0-aa4c-450d-8fbc-67cfb9dd615b","Type":"ContainerDied","Data":"339bad8557439d7afa5c329a435cd35a995246fc604b25e4a8e250f1a7b99368"} Mar 09 13:40:28 crc kubenswrapper[4764]: I0309 13:40:28.934931 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-15a9-account-create-update-5s8tj" event={"ID":"82410bc0-aa4c-450d-8fbc-67cfb9dd615b","Type":"ContainerStarted","Data":"0b8142b3cc805480a67a46802a0365da053e4e9868c09964599a921ca210d192"} Mar 09 13:40:28 crc kubenswrapper[4764]: I0309 13:40:28.938468 4764 generic.go:334] "Generic (PLEG): container finished" podID="46124175-b282-444f-8d9c-0397e35cf8ae" containerID="da404af8a75d74966ac6aec712c910704da97b03d85b23af80911b87587e1ef7" exitCode=0 Mar 09 13:40:28 crc kubenswrapper[4764]: I0309 13:40:28.938610 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-mqv59" event={"ID":"46124175-b282-444f-8d9c-0397e35cf8ae","Type":"ContainerDied","Data":"da404af8a75d74966ac6aec712c910704da97b03d85b23af80911b87587e1ef7"} Mar 09 13:40:28 crc kubenswrapper[4764]: I0309 13:40:28.938684 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-mqv59" event={"ID":"46124175-b282-444f-8d9c-0397e35cf8ae","Type":"ContainerStarted","Data":"8e28314c4efa3469246836e0ce638aaaf4d3b302e3f0249c8ebd090715366479"} Mar 09 13:40:28 crc kubenswrapper[4764]: I0309 13:40:28.944317 4764 generic.go:334] "Generic (PLEG): container finished" podID="642b5df5-dec0-47cc-9595-02b254277452" containerID="6b21bf7421d738d162c19d823aaa8d5749330a4586aa8ead5eb3f5fbb8ebcd4e" exitCode=0 Mar 09 13:40:28 crc kubenswrapper[4764]: I0309 13:40:28.944407 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-thhcb" 
event={"ID":"642b5df5-dec0-47cc-9595-02b254277452","Type":"ContainerDied","Data":"6b21bf7421d738d162c19d823aaa8d5749330a4586aa8ead5eb3f5fbb8ebcd4e"} Mar 09 13:40:28 crc kubenswrapper[4764]: I0309 13:40:28.944455 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-thhcb" event={"ID":"642b5df5-dec0-47cc-9595-02b254277452","Type":"ContainerStarted","Data":"d83672538bf04116631d7414d3375d19a02bf6520cb37838591c270b1cfadc5f"} Mar 09 13:40:28 crc kubenswrapper[4764]: I0309 13:40:28.947184 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-gkf9g" event={"ID":"1bf72fda-56e5-427c-b2d0-8267613d8a9e","Type":"ContainerStarted","Data":"8799a3258f6d77e33d1068648c90a3371654d54435ab51ff0aa268e01baf2e9b"} Mar 09 13:40:28 crc kubenswrapper[4764]: I0309 13:40:28.947215 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-gkf9g" event={"ID":"1bf72fda-56e5-427c-b2d0-8267613d8a9e","Type":"ContainerStarted","Data":"e51ce3016afe58e70de7013784d516c067ad727cd5eede5c2da838d45dc7748c"} Mar 09 13:40:29 crc kubenswrapper[4764]: I0309 13:40:29.043410 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-gkf9g" podStartSLOduration=2.043393455 podStartE2EDuration="2.043393455s" podCreationTimestamp="2026-03-09 13:40:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:40:29.039755383 +0000 UTC m=+1184.289927291" watchObservedRunningTime="2026-03-09 13:40:29.043393455 +0000 UTC m=+1184.293565363" Mar 09 13:40:29 crc kubenswrapper[4764]: I0309 13:40:29.230135 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-rjx7v"] Mar 09 13:40:29 crc kubenswrapper[4764]: I0309 13:40:29.239772 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6d6f-account-create-update-qxm8j"] Mar 09 13:40:29 crc 
kubenswrapper[4764]: W0309 13:40:29.263913 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b86f9b8_6493_4a60_85b3_12057a6a8f65.slice/crio-e0c6dbc0703a920e008f15365138493c9f2dff91ea4203cb37d8d4b845814dc6 WatchSource:0}: Error finding container e0c6dbc0703a920e008f15365138493c9f2dff91ea4203cb37d8d4b845814dc6: Status 404 returned error can't find the container with id e0c6dbc0703a920e008f15365138493c9f2dff91ea4203cb37d8d4b845814dc6 Mar 09 13:40:29 crc kubenswrapper[4764]: I0309 13:40:29.318467 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-a166-account-create-update-kswwc"] Mar 09 13:40:29 crc kubenswrapper[4764]: W0309 13:40:29.326720 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad7d32c2_ffe4_43d5_8640_6219f863bc2a.slice/crio-37c18bc0866de7085e3cc23a30681608a62529af59013b7f38c34f1a01bfb0af WatchSource:0}: Error finding container 37c18bc0866de7085e3cc23a30681608a62529af59013b7f38c34f1a01bfb0af: Status 404 returned error can't find the container with id 37c18bc0866de7085e3cc23a30681608a62529af59013b7f38c34f1a01bfb0af Mar 09 13:40:29 crc kubenswrapper[4764]: I0309 13:40:29.961382 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rjx7v" event={"ID":"ea9388c2-526b-49ff-8f42-03ca66ae08dd","Type":"ContainerStarted","Data":"250d5ff9e06681bc763b0d0c4d24353835d78a5fc9756881a6c43044e79b8237"} Mar 09 13:40:29 crc kubenswrapper[4764]: I0309 13:40:29.963427 4764 generic.go:334] "Generic (PLEG): container finished" podID="ad7d32c2-ffe4-43d5-8640-6219f863bc2a" containerID="cf71be9d097dd827ce8f90129691408cafc7024d1662f6cee61aed9d868f11b5" exitCode=0 Mar 09 13:40:29 crc kubenswrapper[4764]: I0309 13:40:29.963545 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a166-account-create-update-kswwc" 
event={"ID":"ad7d32c2-ffe4-43d5-8640-6219f863bc2a","Type":"ContainerDied","Data":"cf71be9d097dd827ce8f90129691408cafc7024d1662f6cee61aed9d868f11b5"} Mar 09 13:40:29 crc kubenswrapper[4764]: I0309 13:40:29.963753 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a166-account-create-update-kswwc" event={"ID":"ad7d32c2-ffe4-43d5-8640-6219f863bc2a","Type":"ContainerStarted","Data":"37c18bc0866de7085e3cc23a30681608a62529af59013b7f38c34f1a01bfb0af"} Mar 09 13:40:29 crc kubenswrapper[4764]: I0309 13:40:29.966556 4764 generic.go:334] "Generic (PLEG): container finished" podID="5b86f9b8-6493-4a60-85b3-12057a6a8f65" containerID="63ff43316b127bbf8f65a169da3d3d1192d7c9af3a1493dff44991331dc6c723" exitCode=0 Mar 09 13:40:29 crc kubenswrapper[4764]: I0309 13:40:29.966627 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6d6f-account-create-update-qxm8j" event={"ID":"5b86f9b8-6493-4a60-85b3-12057a6a8f65","Type":"ContainerDied","Data":"63ff43316b127bbf8f65a169da3d3d1192d7c9af3a1493dff44991331dc6c723"} Mar 09 13:40:29 crc kubenswrapper[4764]: I0309 13:40:29.966671 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6d6f-account-create-update-qxm8j" event={"ID":"5b86f9b8-6493-4a60-85b3-12057a6a8f65","Type":"ContainerStarted","Data":"e0c6dbc0703a920e008f15365138493c9f2dff91ea4203cb37d8d4b845814dc6"} Mar 09 13:40:29 crc kubenswrapper[4764]: I0309 13:40:29.970768 4764 generic.go:334] "Generic (PLEG): container finished" podID="1bf72fda-56e5-427c-b2d0-8267613d8a9e" containerID="8799a3258f6d77e33d1068648c90a3371654d54435ab51ff0aa268e01baf2e9b" exitCode=0 Mar 09 13:40:29 crc kubenswrapper[4764]: I0309 13:40:29.970998 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-gkf9g" event={"ID":"1bf72fda-56e5-427c-b2d0-8267613d8a9e","Type":"ContainerDied","Data":"8799a3258f6d77e33d1068648c90a3371654d54435ab51ff0aa268e01baf2e9b"} Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 
13:40:30.452117 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-thhcb" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.462256 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-mqv59" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.464970 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-15a9-account-create-update-5s8tj" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.605018 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82410bc0-aa4c-450d-8fbc-67cfb9dd615b-operator-scripts\") pod \"82410bc0-aa4c-450d-8fbc-67cfb9dd615b\" (UID: \"82410bc0-aa4c-450d-8fbc-67cfb9dd615b\") " Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.605264 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/642b5df5-dec0-47cc-9595-02b254277452-operator-scripts\") pod \"642b5df5-dec0-47cc-9595-02b254277452\" (UID: \"642b5df5-dec0-47cc-9595-02b254277452\") " Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.605326 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zqq7\" (UniqueName: \"kubernetes.io/projected/46124175-b282-444f-8d9c-0397e35cf8ae-kube-api-access-6zqq7\") pod \"46124175-b282-444f-8d9c-0397e35cf8ae\" (UID: \"46124175-b282-444f-8d9c-0397e35cf8ae\") " Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.605423 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbcrf\" (UniqueName: \"kubernetes.io/projected/82410bc0-aa4c-450d-8fbc-67cfb9dd615b-kube-api-access-sbcrf\") pod \"82410bc0-aa4c-450d-8fbc-67cfb9dd615b\" (UID: \"82410bc0-aa4c-450d-8fbc-67cfb9dd615b\") " Mar 09 13:40:30 crc 
kubenswrapper[4764]: I0309 13:40:30.605456 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppxms\" (UniqueName: \"kubernetes.io/projected/642b5df5-dec0-47cc-9595-02b254277452-kube-api-access-ppxms\") pod \"642b5df5-dec0-47cc-9595-02b254277452\" (UID: \"642b5df5-dec0-47cc-9595-02b254277452\") " Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.605506 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46124175-b282-444f-8d9c-0397e35cf8ae-operator-scripts\") pod \"46124175-b282-444f-8d9c-0397e35cf8ae\" (UID: \"46124175-b282-444f-8d9c-0397e35cf8ae\") " Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.606874 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46124175-b282-444f-8d9c-0397e35cf8ae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "46124175-b282-444f-8d9c-0397e35cf8ae" (UID: "46124175-b282-444f-8d9c-0397e35cf8ae"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.606942 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/642b5df5-dec0-47cc-9595-02b254277452-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "642b5df5-dec0-47cc-9595-02b254277452" (UID: "642b5df5-dec0-47cc-9595-02b254277452"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.607550 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82410bc0-aa4c-450d-8fbc-67cfb9dd615b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "82410bc0-aa4c-450d-8fbc-67cfb9dd615b" (UID: "82410bc0-aa4c-450d-8fbc-67cfb9dd615b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.614579 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46124175-b282-444f-8d9c-0397e35cf8ae-kube-api-access-6zqq7" (OuterVolumeSpecName: "kube-api-access-6zqq7") pod "46124175-b282-444f-8d9c-0397e35cf8ae" (UID: "46124175-b282-444f-8d9c-0397e35cf8ae"). InnerVolumeSpecName "kube-api-access-6zqq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.615090 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/642b5df5-dec0-47cc-9595-02b254277452-kube-api-access-ppxms" (OuterVolumeSpecName: "kube-api-access-ppxms") pod "642b5df5-dec0-47cc-9595-02b254277452" (UID: "642b5df5-dec0-47cc-9595-02b254277452"). InnerVolumeSpecName "kube-api-access-ppxms". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.620768 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82410bc0-aa4c-450d-8fbc-67cfb9dd615b-kube-api-access-sbcrf" (OuterVolumeSpecName: "kube-api-access-sbcrf") pod "82410bc0-aa4c-450d-8fbc-67cfb9dd615b" (UID: "82410bc0-aa4c-450d-8fbc-67cfb9dd615b"). InnerVolumeSpecName "kube-api-access-sbcrf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.710259 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/642b5df5-dec0-47cc-9595-02b254277452-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.710296 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zqq7\" (UniqueName: \"kubernetes.io/projected/46124175-b282-444f-8d9c-0397e35cf8ae-kube-api-access-6zqq7\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.710308 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbcrf\" (UniqueName: \"kubernetes.io/projected/82410bc0-aa4c-450d-8fbc-67cfb9dd615b-kube-api-access-sbcrf\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.710317 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppxms\" (UniqueName: \"kubernetes.io/projected/642b5df5-dec0-47cc-9595-02b254277452-kube-api-access-ppxms\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.710330 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46124175-b282-444f-8d9c-0397e35cf8ae-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.710341 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82410bc0-aa4c-450d-8fbc-67cfb9dd615b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.989403 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-thhcb" 
event={"ID":"642b5df5-dec0-47cc-9595-02b254277452","Type":"ContainerDied","Data":"d83672538bf04116631d7414d3375d19a02bf6520cb37838591c270b1cfadc5f"} Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.991804 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d83672538bf04116631d7414d3375d19a02bf6520cb37838591c270b1cfadc5f" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.989421 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-thhcb" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.992682 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-15a9-account-create-update-5s8tj" event={"ID":"82410bc0-aa4c-450d-8fbc-67cfb9dd615b","Type":"ContainerDied","Data":"0b8142b3cc805480a67a46802a0365da053e4e9868c09964599a921ca210d192"} Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.992715 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-15a9-account-create-update-5s8tj" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.992764 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b8142b3cc805480a67a46802a0365da053e4e9868c09964599a921ca210d192" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.995686 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-mqv59" event={"ID":"46124175-b282-444f-8d9c-0397e35cf8ae","Type":"ContainerDied","Data":"8e28314c4efa3469246836e0ce638aaaf4d3b302e3f0249c8ebd090715366479"} Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.995733 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e28314c4efa3469246836e0ce638aaaf4d3b302e3f0249c8ebd090715366479" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.995754 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-mqv59" Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.378509 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-gkf9g" Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.386676 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-a166-account-create-update-kswwc" Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.409324 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6d6f-account-create-update-qxm8j" Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.531451 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b86f9b8-6493-4a60-85b3-12057a6a8f65-operator-scripts\") pod \"5b86f9b8-6493-4a60-85b3-12057a6a8f65\" (UID: \"5b86f9b8-6493-4a60-85b3-12057a6a8f65\") " Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.531541 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsb9v\" (UniqueName: \"kubernetes.io/projected/5b86f9b8-6493-4a60-85b3-12057a6a8f65-kube-api-access-wsb9v\") pod \"5b86f9b8-6493-4a60-85b3-12057a6a8f65\" (UID: \"5b86f9b8-6493-4a60-85b3-12057a6a8f65\") " Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.531618 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tdsb\" (UniqueName: \"kubernetes.io/projected/ad7d32c2-ffe4-43d5-8640-6219f863bc2a-kube-api-access-6tdsb\") pod \"ad7d32c2-ffe4-43d5-8640-6219f863bc2a\" (UID: \"ad7d32c2-ffe4-43d5-8640-6219f863bc2a\") " Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.531866 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/1bf72fda-56e5-427c-b2d0-8267613d8a9e-operator-scripts\") pod \"1bf72fda-56e5-427c-b2d0-8267613d8a9e\" (UID: \"1bf72fda-56e5-427c-b2d0-8267613d8a9e\") " Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.531913 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gw7zq\" (UniqueName: \"kubernetes.io/projected/1bf72fda-56e5-427c-b2d0-8267613d8a9e-kube-api-access-gw7zq\") pod \"1bf72fda-56e5-427c-b2d0-8267613d8a9e\" (UID: \"1bf72fda-56e5-427c-b2d0-8267613d8a9e\") " Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.531997 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad7d32c2-ffe4-43d5-8640-6219f863bc2a-operator-scripts\") pod \"ad7d32c2-ffe4-43d5-8640-6219f863bc2a\" (UID: \"ad7d32c2-ffe4-43d5-8640-6219f863bc2a\") " Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.532583 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b86f9b8-6493-4a60-85b3-12057a6a8f65-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5b86f9b8-6493-4a60-85b3-12057a6a8f65" (UID: "5b86f9b8-6493-4a60-85b3-12057a6a8f65"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.532902 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b86f9b8-6493-4a60-85b3-12057a6a8f65-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.533005 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf72fda-56e5-427c-b2d0-8267613d8a9e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1bf72fda-56e5-427c-b2d0-8267613d8a9e" (UID: "1bf72fda-56e5-427c-b2d0-8267613d8a9e"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.533703 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad7d32c2-ffe4-43d5-8640-6219f863bc2a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ad7d32c2-ffe4-43d5-8640-6219f863bc2a" (UID: "ad7d32c2-ffe4-43d5-8640-6219f863bc2a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.541978 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad7d32c2-ffe4-43d5-8640-6219f863bc2a-kube-api-access-6tdsb" (OuterVolumeSpecName: "kube-api-access-6tdsb") pod "ad7d32c2-ffe4-43d5-8640-6219f863bc2a" (UID: "ad7d32c2-ffe4-43d5-8640-6219f863bc2a"). InnerVolumeSpecName "kube-api-access-6tdsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.545667 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf72fda-56e5-427c-b2d0-8267613d8a9e-kube-api-access-gw7zq" (OuterVolumeSpecName: "kube-api-access-gw7zq") pod "1bf72fda-56e5-427c-b2d0-8267613d8a9e" (UID: "1bf72fda-56e5-427c-b2d0-8267613d8a9e"). InnerVolumeSpecName "kube-api-access-gw7zq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.545868 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b86f9b8-6493-4a60-85b3-12057a6a8f65-kube-api-access-wsb9v" (OuterVolumeSpecName: "kube-api-access-wsb9v") pod "5b86f9b8-6493-4a60-85b3-12057a6a8f65" (UID: "5b86f9b8-6493-4a60-85b3-12057a6a8f65"). InnerVolumeSpecName "kube-api-access-wsb9v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.634768 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tdsb\" (UniqueName: \"kubernetes.io/projected/ad7d32c2-ffe4-43d5-8640-6219f863bc2a-kube-api-access-6tdsb\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.634846 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bf72fda-56e5-427c-b2d0-8267613d8a9e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.634906 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gw7zq\" (UniqueName: \"kubernetes.io/projected/1bf72fda-56e5-427c-b2d0-8267613d8a9e-kube-api-access-gw7zq\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.634924 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad7d32c2-ffe4-43d5-8640-6219f863bc2a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.634937 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsb9v\" (UniqueName: \"kubernetes.io/projected/5b86f9b8-6493-4a60-85b3-12057a6a8f65-kube-api-access-wsb9v\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:32 crc kubenswrapper[4764]: I0309 13:40:32.016696 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-a166-account-create-update-kswwc" Mar 09 13:40:32 crc kubenswrapper[4764]: I0309 13:40:32.016817 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a166-account-create-update-kswwc" event={"ID":"ad7d32c2-ffe4-43d5-8640-6219f863bc2a","Type":"ContainerDied","Data":"37c18bc0866de7085e3cc23a30681608a62529af59013b7f38c34f1a01bfb0af"} Mar 09 13:40:32 crc kubenswrapper[4764]: I0309 13:40:32.016890 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37c18bc0866de7085e3cc23a30681608a62529af59013b7f38c34f1a01bfb0af" Mar 09 13:40:32 crc kubenswrapper[4764]: I0309 13:40:32.026018 4764 generic.go:334] "Generic (PLEG): container finished" podID="29e20119-f7d3-4b10-82c3-afbfa462c831" containerID="06f1bf05b0fa436ae59020308ffb3c9a4a73f8d30f844f8ee3828c34f9dbc702" exitCode=0 Mar 09 13:40:32 crc kubenswrapper[4764]: I0309 13:40:32.026210 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vzxr2" event={"ID":"29e20119-f7d3-4b10-82c3-afbfa462c831","Type":"ContainerDied","Data":"06f1bf05b0fa436ae59020308ffb3c9a4a73f8d30f844f8ee3828c34f9dbc702"} Mar 09 13:40:32 crc kubenswrapper[4764]: I0309 13:40:32.030748 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6d6f-account-create-update-qxm8j" event={"ID":"5b86f9b8-6493-4a60-85b3-12057a6a8f65","Type":"ContainerDied","Data":"e0c6dbc0703a920e008f15365138493c9f2dff91ea4203cb37d8d4b845814dc6"} Mar 09 13:40:32 crc kubenswrapper[4764]: I0309 13:40:32.030787 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0c6dbc0703a920e008f15365138493c9f2dff91ea4203cb37d8d4b845814dc6" Mar 09 13:40:32 crc kubenswrapper[4764]: I0309 13:40:32.030867 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6d6f-account-create-update-qxm8j" Mar 09 13:40:32 crc kubenswrapper[4764]: I0309 13:40:32.033491 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-gkf9g" event={"ID":"1bf72fda-56e5-427c-b2d0-8267613d8a9e","Type":"ContainerDied","Data":"e51ce3016afe58e70de7013784d516c067ad727cd5eede5c2da838d45dc7748c"} Mar 09 13:40:32 crc kubenswrapper[4764]: I0309 13:40:32.033516 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e51ce3016afe58e70de7013784d516c067ad727cd5eede5c2da838d45dc7748c" Mar 09 13:40:32 crc kubenswrapper[4764]: I0309 13:40:32.033556 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-gkf9g" Mar 09 13:40:34 crc kubenswrapper[4764]: I0309 13:40:34.494238 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-vzxr2" Mar 09 13:40:34 crc kubenswrapper[4764]: I0309 13:40:34.596531 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-config-data\") pod \"29e20119-f7d3-4b10-82c3-afbfa462c831\" (UID: \"29e20119-f7d3-4b10-82c3-afbfa462c831\") " Mar 09 13:40:34 crc kubenswrapper[4764]: I0309 13:40:34.596652 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-db-sync-config-data\") pod \"29e20119-f7d3-4b10-82c3-afbfa462c831\" (UID: \"29e20119-f7d3-4b10-82c3-afbfa462c831\") " Mar 09 13:40:34 crc kubenswrapper[4764]: I0309 13:40:34.596691 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-combined-ca-bundle\") pod \"29e20119-f7d3-4b10-82c3-afbfa462c831\" (UID: 
\"29e20119-f7d3-4b10-82c3-afbfa462c831\") " Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.061086 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "29e20119-f7d3-4b10-82c3-afbfa462c831" (UID: "29e20119-f7d3-4b10-82c3-afbfa462c831"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.067187 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pflc\" (UniqueName: \"kubernetes.io/projected/29e20119-f7d3-4b10-82c3-afbfa462c831-kube-api-access-2pflc\") pod \"29e20119-f7d3-4b10-82c3-afbfa462c831\" (UID: \"29e20119-f7d3-4b10-82c3-afbfa462c831\") " Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.067590 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-db-sync-config-data\") pod \"29e20119-f7d3-4b10-82c3-afbfa462c831\" (UID: \"29e20119-f7d3-4b10-82c3-afbfa462c831\") " Mar 09 13:40:35 crc kubenswrapper[4764]: W0309 13:40:35.069085 4764 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/29e20119-f7d3-4b10-82c3-afbfa462c831/volumes/kubernetes.io~secret/db-sync-config-data Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.069117 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "29e20119-f7d3-4b10-82c3-afbfa462c831" (UID: "29e20119-f7d3-4b10-82c3-afbfa462c831"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.069493 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29e20119-f7d3-4b10-82c3-afbfa462c831" (UID: "29e20119-f7d3-4b10-82c3-afbfa462c831"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.074461 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29e20119-f7d3-4b10-82c3-afbfa462c831-kube-api-access-2pflc" (OuterVolumeSpecName: "kube-api-access-2pflc") pod "29e20119-f7d3-4b10-82c3-afbfa462c831" (UID: "29e20119-f7d3-4b10-82c3-afbfa462c831"). InnerVolumeSpecName "kube-api-access-2pflc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.077155 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pflc\" (UniqueName: \"kubernetes.io/projected/29e20119-f7d3-4b10-82c3-afbfa462c831-kube-api-access-2pflc\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.077198 4764 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.077220 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.086952 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vzxr2" 
event={"ID":"29e20119-f7d3-4b10-82c3-afbfa462c831","Type":"ContainerDied","Data":"bf52819fae166b95cbd52539da218f51f13c6084d42cea34252b0b03900eaee5"} Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.087036 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf52819fae166b95cbd52539da218f51f13c6084d42cea34252b0b03900eaee5" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.087554 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-vzxr2" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.093332 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-config-data" (OuterVolumeSpecName: "config-data") pod "29e20119-f7d3-4b10-82c3-afbfa462c831" (UID: "29e20119-f7d3-4b10-82c3-afbfa462c831"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.111991 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-rjx7v" podStartSLOduration=3.008077395 podStartE2EDuration="8.111971865s" podCreationTimestamp="2026-03-09 13:40:27 +0000 UTC" firstStartedPulling="2026-03-09 13:40:29.244579077 +0000 UTC m=+1184.494750985" lastFinishedPulling="2026-03-09 13:40:34.348473547 +0000 UTC m=+1189.598645455" observedRunningTime="2026-03-09 13:40:35.109653127 +0000 UTC m=+1190.359825055" watchObservedRunningTime="2026-03-09 13:40:35.111971865 +0000 UTC m=+1190.362143773" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.179471 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.922784 4764 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-554567b4f7-28h2r"] Mar 09 13:40:35 crc kubenswrapper[4764]: E0309 13:40:35.923269 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad7d32c2-ffe4-43d5-8640-6219f863bc2a" containerName="mariadb-account-create-update" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.923288 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad7d32c2-ffe4-43d5-8640-6219f863bc2a" containerName="mariadb-account-create-update" Mar 09 13:40:35 crc kubenswrapper[4764]: E0309 13:40:35.923300 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46124175-b282-444f-8d9c-0397e35cf8ae" containerName="mariadb-database-create" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.923308 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="46124175-b282-444f-8d9c-0397e35cf8ae" containerName="mariadb-database-create" Mar 09 13:40:35 crc kubenswrapper[4764]: E0309 13:40:35.923319 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="642b5df5-dec0-47cc-9595-02b254277452" containerName="mariadb-database-create" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.923329 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="642b5df5-dec0-47cc-9595-02b254277452" containerName="mariadb-database-create" Mar 09 13:40:35 crc kubenswrapper[4764]: E0309 13:40:35.923345 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82410bc0-aa4c-450d-8fbc-67cfb9dd615b" containerName="mariadb-account-create-update" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.923352 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="82410bc0-aa4c-450d-8fbc-67cfb9dd615b" containerName="mariadb-account-create-update" Mar 09 13:40:35 crc kubenswrapper[4764]: E0309 13:40:35.923365 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b86f9b8-6493-4a60-85b3-12057a6a8f65" containerName="mariadb-account-create-update" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 
13:40:35.923374 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b86f9b8-6493-4a60-85b3-12057a6a8f65" containerName="mariadb-account-create-update" Mar 09 13:40:35 crc kubenswrapper[4764]: E0309 13:40:35.923387 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29e20119-f7d3-4b10-82c3-afbfa462c831" containerName="glance-db-sync" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.923395 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="29e20119-f7d3-4b10-82c3-afbfa462c831" containerName="glance-db-sync" Mar 09 13:40:35 crc kubenswrapper[4764]: E0309 13:40:35.923416 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bf72fda-56e5-427c-b2d0-8267613d8a9e" containerName="mariadb-database-create" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.923434 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bf72fda-56e5-427c-b2d0-8267613d8a9e" containerName="mariadb-database-create" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.923585 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bf72fda-56e5-427c-b2d0-8267613d8a9e" containerName="mariadb-database-create" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.923596 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b86f9b8-6493-4a60-85b3-12057a6a8f65" containerName="mariadb-account-create-update" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.923606 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="29e20119-f7d3-4b10-82c3-afbfa462c831" containerName="glance-db-sync" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.923616 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="46124175-b282-444f-8d9c-0397e35cf8ae" containerName="mariadb-database-create" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.923627 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="82410bc0-aa4c-450d-8fbc-67cfb9dd615b" 
containerName="mariadb-account-create-update" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.923635 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad7d32c2-ffe4-43d5-8640-6219f863bc2a" containerName="mariadb-account-create-update" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.923668 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="642b5df5-dec0-47cc-9595-02b254277452" containerName="mariadb-database-create" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.924517 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.941570 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-554567b4f7-28h2r"] Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.094044 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-ovsdbserver-nb\") pod \"dnsmasq-dns-554567b4f7-28h2r\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.094423 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwv6l\" (UniqueName: \"kubernetes.io/projected/bb8bce39-6992-4785-a460-24d6def57630-kube-api-access-xwv6l\") pod \"dnsmasq-dns-554567b4f7-28h2r\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.094493 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-config\") pod \"dnsmasq-dns-554567b4f7-28h2r\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " 
pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.094517 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-dns-svc\") pod \"dnsmasq-dns-554567b4f7-28h2r\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.094543 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-ovsdbserver-sb\") pod \"dnsmasq-dns-554567b4f7-28h2r\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.100342 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rjx7v" event={"ID":"ea9388c2-526b-49ff-8f42-03ca66ae08dd","Type":"ContainerStarted","Data":"bffbf528e6b05eccf8c0e302264ead1f4f679cc141d19e6aa19cb09ac29fba17"} Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.197131 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-ovsdbserver-nb\") pod \"dnsmasq-dns-554567b4f7-28h2r\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.197209 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwv6l\" (UniqueName: \"kubernetes.io/projected/bb8bce39-6992-4785-a460-24d6def57630-kube-api-access-xwv6l\") pod \"dnsmasq-dns-554567b4f7-28h2r\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 
13:40:36.197324 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-config\") pod \"dnsmasq-dns-554567b4f7-28h2r\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.197351 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-dns-svc\") pod \"dnsmasq-dns-554567b4f7-28h2r\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.197383 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-ovsdbserver-sb\") pod \"dnsmasq-dns-554567b4f7-28h2r\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.198443 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-ovsdbserver-sb\") pod \"dnsmasq-dns-554567b4f7-28h2r\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.199009 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-config\") pod \"dnsmasq-dns-554567b4f7-28h2r\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.199476 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-dns-svc\") pod \"dnsmasq-dns-554567b4f7-28h2r\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.201674 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-ovsdbserver-nb\") pod \"dnsmasq-dns-554567b4f7-28h2r\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.234725 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwv6l\" (UniqueName: \"kubernetes.io/projected/bb8bce39-6992-4785-a460-24d6def57630-kube-api-access-xwv6l\") pod \"dnsmasq-dns-554567b4f7-28h2r\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.246911 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.720900 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-554567b4f7-28h2r"] Mar 09 13:40:36 crc kubenswrapper[4764]: W0309 13:40:36.725326 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb8bce39_6992_4785_a460_24d6def57630.slice/crio-e0c540986c124380f8b3c90bbd614819fd8fb0c94ebb67116f54a09ce532fcfa WatchSource:0}: Error finding container e0c540986c124380f8b3c90bbd614819fd8fb0c94ebb67116f54a09ce532fcfa: Status 404 returned error can't find the container with id e0c540986c124380f8b3c90bbd614819fd8fb0c94ebb67116f54a09ce532fcfa Mar 09 13:40:37 crc kubenswrapper[4764]: I0309 13:40:37.112180 4764 generic.go:334] "Generic (PLEG): container finished" podID="bb8bce39-6992-4785-a460-24d6def57630" containerID="0a568401ccdf9f344a39bc852ee338a2a7b840ec0f974c1b01a9b85e341b5181" exitCode=0 Mar 09 13:40:37 crc kubenswrapper[4764]: I0309 13:40:37.112296 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554567b4f7-28h2r" event={"ID":"bb8bce39-6992-4785-a460-24d6def57630","Type":"ContainerDied","Data":"0a568401ccdf9f344a39bc852ee338a2a7b840ec0f974c1b01a9b85e341b5181"} Mar 09 13:40:37 crc kubenswrapper[4764]: I0309 13:40:37.114700 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554567b4f7-28h2r" event={"ID":"bb8bce39-6992-4785-a460-24d6def57630","Type":"ContainerStarted","Data":"e0c540986c124380f8b3c90bbd614819fd8fb0c94ebb67116f54a09ce532fcfa"} Mar 09 13:40:38 crc kubenswrapper[4764]: I0309 13:40:38.126287 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554567b4f7-28h2r" event={"ID":"bb8bce39-6992-4785-a460-24d6def57630","Type":"ContainerStarted","Data":"645a2f0f2f39fa20266b4afd9cec3444922a6377324ddf9bf9434ae55054aa8e"} Mar 09 13:40:38 crc 
kubenswrapper[4764]: I0309 13:40:38.126757 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:38 crc kubenswrapper[4764]: I0309 13:40:38.149186 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-554567b4f7-28h2r" podStartSLOduration=3.149154181 podStartE2EDuration="3.149154181s" podCreationTimestamp="2026-03-09 13:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:40:38.143612372 +0000 UTC m=+1193.393784280" watchObservedRunningTime="2026-03-09 13:40:38.149154181 +0000 UTC m=+1193.399326089" Mar 09 13:40:39 crc kubenswrapper[4764]: I0309 13:40:39.137399 4764 generic.go:334] "Generic (PLEG): container finished" podID="ea9388c2-526b-49ff-8f42-03ca66ae08dd" containerID="bffbf528e6b05eccf8c0e302264ead1f4f679cc141d19e6aa19cb09ac29fba17" exitCode=0 Mar 09 13:40:39 crc kubenswrapper[4764]: I0309 13:40:39.137534 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rjx7v" event={"ID":"ea9388c2-526b-49ff-8f42-03ca66ae08dd","Type":"ContainerDied","Data":"bffbf528e6b05eccf8c0e302264ead1f4f679cc141d19e6aa19cb09ac29fba17"} Mar 09 13:40:40 crc kubenswrapper[4764]: I0309 13:40:40.499612 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-rjx7v" Mar 09 13:40:40 crc kubenswrapper[4764]: I0309 13:40:40.573531 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea9388c2-526b-49ff-8f42-03ca66ae08dd-config-data\") pod \"ea9388c2-526b-49ff-8f42-03ca66ae08dd\" (UID: \"ea9388c2-526b-49ff-8f42-03ca66ae08dd\") " Mar 09 13:40:40 crc kubenswrapper[4764]: I0309 13:40:40.573692 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea9388c2-526b-49ff-8f42-03ca66ae08dd-combined-ca-bundle\") pod \"ea9388c2-526b-49ff-8f42-03ca66ae08dd\" (UID: \"ea9388c2-526b-49ff-8f42-03ca66ae08dd\") " Mar 09 13:40:40 crc kubenswrapper[4764]: I0309 13:40:40.573727 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8bhv\" (UniqueName: \"kubernetes.io/projected/ea9388c2-526b-49ff-8f42-03ca66ae08dd-kube-api-access-q8bhv\") pod \"ea9388c2-526b-49ff-8f42-03ca66ae08dd\" (UID: \"ea9388c2-526b-49ff-8f42-03ca66ae08dd\") " Mar 09 13:40:40 crc kubenswrapper[4764]: I0309 13:40:40.580329 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea9388c2-526b-49ff-8f42-03ca66ae08dd-kube-api-access-q8bhv" (OuterVolumeSpecName: "kube-api-access-q8bhv") pod "ea9388c2-526b-49ff-8f42-03ca66ae08dd" (UID: "ea9388c2-526b-49ff-8f42-03ca66ae08dd"). InnerVolumeSpecName "kube-api-access-q8bhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:40 crc kubenswrapper[4764]: I0309 13:40:40.604620 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea9388c2-526b-49ff-8f42-03ca66ae08dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea9388c2-526b-49ff-8f42-03ca66ae08dd" (UID: "ea9388c2-526b-49ff-8f42-03ca66ae08dd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:40:40 crc kubenswrapper[4764]: I0309 13:40:40.617798 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea9388c2-526b-49ff-8f42-03ca66ae08dd-config-data" (OuterVolumeSpecName: "config-data") pod "ea9388c2-526b-49ff-8f42-03ca66ae08dd" (UID: "ea9388c2-526b-49ff-8f42-03ca66ae08dd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:40:40 crc kubenswrapper[4764]: I0309 13:40:40.681424 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea9388c2-526b-49ff-8f42-03ca66ae08dd-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:40 crc kubenswrapper[4764]: I0309 13:40:40.681465 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea9388c2-526b-49ff-8f42-03ca66ae08dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:40 crc kubenswrapper[4764]: I0309 13:40:40.681481 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8bhv\" (UniqueName: \"kubernetes.io/projected/ea9388c2-526b-49ff-8f42-03ca66ae08dd-kube-api-access-q8bhv\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.155737 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rjx7v" event={"ID":"ea9388c2-526b-49ff-8f42-03ca66ae08dd","Type":"ContainerDied","Data":"250d5ff9e06681bc763b0d0c4d24353835d78a5fc9756881a6c43044e79b8237"} Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.155782 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="250d5ff9e06681bc763b0d0c4d24353835d78a5fc9756881a6c43044e79b8237" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.156319 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-rjx7v" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.408125 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-554567b4f7-28h2r"] Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.408418 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-554567b4f7-28h2r" podUID="bb8bce39-6992-4785-a460-24d6def57630" containerName="dnsmasq-dns" containerID="cri-o://645a2f0f2f39fa20266b4afd9cec3444922a6377324ddf9bf9434ae55054aa8e" gracePeriod=10 Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.455052 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67795cd9-n77tv"] Mar 09 13:40:41 crc kubenswrapper[4764]: E0309 13:40:41.455434 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea9388c2-526b-49ff-8f42-03ca66ae08dd" containerName="keystone-db-sync" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.455450 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea9388c2-526b-49ff-8f42-03ca66ae08dd" containerName="keystone-db-sync" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.455619 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea9388c2-526b-49ff-8f42-03ca66ae08dd" containerName="keystone-db-sync" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.456504 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.464022 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-mtsrr"] Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.465237 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.476063 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.476248 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.476274 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.476467 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.476814 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4hfql" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.495723 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67795cd9-n77tv"] Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.533716 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mtsrr"] Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.629947 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-credential-keys\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.630545 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-ovsdbserver-sb\") pod \"dnsmasq-dns-67795cd9-n77tv\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " 
pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.630848 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-config\") pod \"dnsmasq-dns-67795cd9-n77tv\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.630949 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-dns-svc\") pod \"dnsmasq-dns-67795cd9-n77tv\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.630987 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-combined-ca-bundle\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.631142 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-scripts\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.631215 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-ovsdbserver-nb\") pod \"dnsmasq-dns-67795cd9-n77tv\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " 
pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.631233 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6nvs\" (UniqueName: \"kubernetes.io/projected/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-kube-api-access-t6nvs\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.631353 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs6wz\" (UniqueName: \"kubernetes.io/projected/2a710e46-50b7-4069-b15c-ee3d19bc06e0-kube-api-access-hs6wz\") pod \"dnsmasq-dns-67795cd9-n77tv\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.631402 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-config-data\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.631464 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-fernet-keys\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.658698 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-dp5x6"] Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.659920 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.669473 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.672502 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-4vt4n" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.672536 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.677247 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-dp5x6"] Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.716609 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.724763 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.734004 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.734279 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.735342 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs6wz\" (UniqueName: \"kubernetes.io/projected/2a710e46-50b7-4069-b15c-ee3d19bc06e0-kube-api-access-hs6wz\") pod \"dnsmasq-dns-67795cd9-n77tv\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.735395 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-config-data\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.735422 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-fernet-keys\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.735454 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-credential-keys\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.735473 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-ovsdbserver-sb\") pod \"dnsmasq-dns-67795cd9-n77tv\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.735492 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-config\") pod \"dnsmasq-dns-67795cd9-n77tv\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.735528 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-dns-svc\") pod \"dnsmasq-dns-67795cd9-n77tv\" (UID: 
\"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.735549 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-combined-ca-bundle\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.735606 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-scripts\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.735637 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-ovsdbserver-nb\") pod \"dnsmasq-dns-67795cd9-n77tv\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.735666 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6nvs\" (UniqueName: \"kubernetes.io/projected/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-kube-api-access-t6nvs\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.736975 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-ovsdbserver-sb\") pod \"dnsmasq-dns-67795cd9-n77tv\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 
09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.738112 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-config\") pod \"dnsmasq-dns-67795cd9-n77tv\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.739308 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-dns-svc\") pod \"dnsmasq-dns-67795cd9-n77tv\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.739867 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-ovsdbserver-nb\") pod \"dnsmasq-dns-67795cd9-n77tv\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.751096 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-fernet-keys\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.752258 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-scripts\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.767264 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-config-data\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.780095 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.783336 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-credential-keys\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.788107 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-combined-ca-bundle\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.798415 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs6wz\" (UniqueName: \"kubernetes.io/projected/2a710e46-50b7-4069-b15c-ee3d19bc06e0-kube-api-access-hs6wz\") pod \"dnsmasq-dns-67795cd9-n77tv\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.802111 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-cmhtp"] Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.803983 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-cmhtp" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.812533 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.812784 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-fn2ft" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.812944 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.827352 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-cmhtp"] Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.828093 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6nvs\" (UniqueName: \"kubernetes.io/projected/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-kube-api-access-t6nvs\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.841552 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-config-data\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.841706 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-scripts\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.841754 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/861cdd7d-b563-4009-9c33-a5c64d6ffae9-log-httpd\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.841772 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/861cdd7d-b563-4009-9c33-a5c64d6ffae9-run-httpd\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.841799 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-config-data\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.841820 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.841842 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-scripts\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.841874 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khssq\" (UniqueName: 
\"kubernetes.io/projected/74146b7d-9780-4d2d-9454-853296f88955-kube-api-access-khssq\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.841911 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj2kg\" (UniqueName: \"kubernetes.io/projected/861cdd7d-b563-4009-9c33-a5c64d6ffae9-kube-api-access-mj2kg\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.841938 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74146b7d-9780-4d2d-9454-853296f88955-etc-machine-id\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.841953 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-db-sync-config-data\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.841969 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-combined-ca-bundle\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.842027 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.874544 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.902198 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-x9gvc"] Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.904335 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-x9gvc" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.911572 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.911809 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-j8zg7" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.920822 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.940749 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-x9gvc"] Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.943769 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-scripts\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.943893 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-config\") pod \"neutron-db-sync-cmhtp\" (UID: \"34466abc-30eb-4a0c-b4ea-50b5ab368fa1\") " pod="openstack/neutron-db-sync-cmhtp" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.943937 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/861cdd7d-b563-4009-9c33-a5c64d6ffae9-log-httpd\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.943954 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/861cdd7d-b563-4009-9c33-a5c64d6ffae9-run-httpd\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.943978 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-config-data\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 
crc kubenswrapper[4764]: I0309 13:40:41.943996 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.944037 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-scripts\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.944060 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khssq\" (UniqueName: \"kubernetes.io/projected/74146b7d-9780-4d2d-9454-853296f88955-kube-api-access-khssq\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.944094 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj2kg\" (UniqueName: \"kubernetes.io/projected/861cdd7d-b563-4009-9c33-a5c64d6ffae9-kube-api-access-mj2kg\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.944118 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74146b7d-9780-4d2d-9454-853296f88955-etc-machine-id\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.944146 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-db-sync-config-data\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.944162 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-combined-ca-bundle\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.944203 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-combined-ca-bundle\") pod \"neutron-db-sync-cmhtp\" (UID: \"34466abc-30eb-4a0c-b4ea-50b5ab368fa1\") " pod="openstack/neutron-db-sync-cmhtp" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.944224 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.944242 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lj9f\" (UniqueName: \"kubernetes.io/projected/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-kube-api-access-8lj9f\") pod \"neutron-db-sync-cmhtp\" (UID: \"34466abc-30eb-4a0c-b4ea-50b5ab368fa1\") " pod="openstack/neutron-db-sync-cmhtp" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.944264 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-config-data\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.946300 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/861cdd7d-b563-4009-9c33-a5c64d6ffae9-run-httpd\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.946753 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/861cdd7d-b563-4009-9c33-a5c64d6ffae9-log-httpd\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.950440 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67795cd9-n77tv"] Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.950545 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74146b7d-9780-4d2d-9454-853296f88955-etc-machine-id\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.952109 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.960450 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-combined-ca-bundle\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.960847 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-db-sync-config-data\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.969035 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-config-data\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.973573 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-scripts\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.976220 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.980753 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-config-data\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 
13:40:41.986211 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-scripts\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.009479 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khssq\" (UniqueName: \"kubernetes.io/projected/74146b7d-9780-4d2d-9454-853296f88955-kube-api-access-khssq\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.013625 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj2kg\" (UniqueName: \"kubernetes.io/projected/861cdd7d-b563-4009-9c33-a5c64d6ffae9-kube-api-access-mj2kg\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.020904 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-47q58"] Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.029342 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.039759 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-bnpcj"] Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.042469 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.045918 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jvrd\" (UniqueName: \"kubernetes.io/projected/cb54f57d-afb6-4e53-be9a-4b22573a9450-kube-api-access-7jvrd\") pod \"barbican-db-sync-x9gvc\" (UID: \"cb54f57d-afb6-4e53-be9a-4b22573a9450\") " pod="openstack/barbican-db-sync-x9gvc" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.045983 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-combined-ca-bundle\") pod \"neutron-db-sync-cmhtp\" (UID: \"34466abc-30eb-4a0c-b4ea-50b5ab368fa1\") " pod="openstack/neutron-db-sync-cmhtp" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.046008 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lj9f\" (UniqueName: \"kubernetes.io/projected/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-kube-api-access-8lj9f\") pod \"neutron-db-sync-cmhtp\" (UID: \"34466abc-30eb-4a0c-b4ea-50b5ab368fa1\") " pod="openstack/neutron-db-sync-cmhtp" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.046038 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cb54f57d-afb6-4e53-be9a-4b22573a9450-db-sync-config-data\") pod \"barbican-db-sync-x9gvc\" (UID: \"cb54f57d-afb6-4e53-be9a-4b22573a9450\") " pod="openstack/barbican-db-sync-x9gvc" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.046086 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb54f57d-afb6-4e53-be9a-4b22573a9450-combined-ca-bundle\") pod \"barbican-db-sync-x9gvc\" (UID: 
\"cb54f57d-afb6-4e53-be9a-4b22573a9450\") " pod="openstack/barbican-db-sync-x9gvc" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.046120 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-config\") pod \"neutron-db-sync-cmhtp\" (UID: \"34466abc-30eb-4a0c-b4ea-50b5ab368fa1\") " pod="openstack/neutron-db-sync-cmhtp" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.048104 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.048276 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.048443 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-l4tqv" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.050332 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-combined-ca-bundle\") pod \"neutron-db-sync-cmhtp\" (UID: \"34466abc-30eb-4a0c-b4ea-50b5ab368fa1\") " pod="openstack/neutron-db-sync-cmhtp" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.052599 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-config\") pod \"neutron-db-sync-cmhtp\" (UID: \"34466abc-30eb-4a0c-b4ea-50b5ab368fa1\") " pod="openstack/neutron-db-sync-cmhtp" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.061202 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-47q58"] Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.083671 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lj9f\" 
(UniqueName: \"kubernetes.io/projected/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-kube-api-access-8lj9f\") pod \"neutron-db-sync-cmhtp\" (UID: \"34466abc-30eb-4a0c-b4ea-50b5ab368fa1\") " pod="openstack/neutron-db-sync-cmhtp" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.085604 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-bnpcj"] Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.126522 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.149103 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-scripts\") pod \"placement-db-sync-bnpcj\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") " pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.149177 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-combined-ca-bundle\") pod \"placement-db-sync-bnpcj\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") " pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.149217 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jvrd\" (UniqueName: \"kubernetes.io/projected/cb54f57d-afb6-4e53-be9a-4b22573a9450-kube-api-access-7jvrd\") pod \"barbican-db-sync-x9gvc\" (UID: \"cb54f57d-afb6-4e53-be9a-4b22573a9450\") " pod="openstack/barbican-db-sync-x9gvc" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.149257 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-ovsdbserver-nb\") pod \"dnsmasq-dns-5b6dbdb6f5-47q58\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.149297 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-config-data\") pod \"placement-db-sync-bnpcj\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") " pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.149323 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-ovsdbserver-sb\") pod \"dnsmasq-dns-5b6dbdb6f5-47q58\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.149347 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cb54f57d-afb6-4e53-be9a-4b22573a9450-db-sync-config-data\") pod \"barbican-db-sync-x9gvc\" (UID: \"cb54f57d-afb6-4e53-be9a-4b22573a9450\") " pod="openstack/barbican-db-sync-x9gvc" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.159913 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb54f57d-afb6-4e53-be9a-4b22573a9450-combined-ca-bundle\") pod \"barbican-db-sync-x9gvc\" (UID: \"cb54f57d-afb6-4e53-be9a-4b22573a9450\") " pod="openstack/barbican-db-sync-x9gvc" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.159997 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-config\") pod \"dnsmasq-dns-5b6dbdb6f5-47q58\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.160069 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1004910c-0db4-4e3d-aac5-358a557ee268-logs\") pod \"placement-db-sync-bnpcj\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") " pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.160096 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9qxl\" (UniqueName: \"kubernetes.io/projected/1004910c-0db4-4e3d-aac5-358a557ee268-kube-api-access-v9qxl\") pod \"placement-db-sync-bnpcj\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") " pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.160144 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-dns-svc\") pod \"dnsmasq-dns-5b6dbdb6f5-47q58\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.160160 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4km7\" (UniqueName: \"kubernetes.io/projected/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-kube-api-access-p4km7\") pod \"dnsmasq-dns-5b6dbdb6f5-47q58\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.166037 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/cb54f57d-afb6-4e53-be9a-4b22573a9450-db-sync-config-data\") pod \"barbican-db-sync-x9gvc\" (UID: \"cb54f57d-afb6-4e53-be9a-4b22573a9450\") " pod="openstack/barbican-db-sync-x9gvc" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.169545 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb54f57d-afb6-4e53-be9a-4b22573a9450-combined-ca-bundle\") pod \"barbican-db-sync-x9gvc\" (UID: \"cb54f57d-afb6-4e53-be9a-4b22573a9450\") " pod="openstack/barbican-db-sync-x9gvc" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.180574 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jvrd\" (UniqueName: \"kubernetes.io/projected/cb54f57d-afb6-4e53-be9a-4b22573a9450-kube-api-access-7jvrd\") pod \"barbican-db-sync-x9gvc\" (UID: \"cb54f57d-afb6-4e53-be9a-4b22573a9450\") " pod="openstack/barbican-db-sync-x9gvc" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.187258 4764 generic.go:334] "Generic (PLEG): container finished" podID="bb8bce39-6992-4785-a460-24d6def57630" containerID="645a2f0f2f39fa20266b4afd9cec3444922a6377324ddf9bf9434ae55054aa8e" exitCode=0 Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.187326 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554567b4f7-28h2r" event={"ID":"bb8bce39-6992-4785-a460-24d6def57630","Type":"ContainerDied","Data":"645a2f0f2f39fa20266b4afd9cec3444922a6377324ddf9bf9434ae55054aa8e"} Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.187357 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554567b4f7-28h2r" event={"ID":"bb8bce39-6992-4785-a460-24d6def57630","Type":"ContainerDied","Data":"e0c540986c124380f8b3c90bbd614819fd8fb0c94ebb67116f54a09ce532fcfa"} Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.187376 4764 scope.go:117] "RemoveContainer" 
containerID="645a2f0f2f39fa20266b4afd9cec3444922a6377324ddf9bf9434ae55054aa8e" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.187590 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.188699 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.209030 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-cmhtp" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.243692 4764 scope.go:117] "RemoveContainer" containerID="0a568401ccdf9f344a39bc852ee338a2a7b840ec0f974c1b01a9b85e341b5181" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.264685 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-ovsdbserver-nb\") pod \"bb8bce39-6992-4785-a460-24d6def57630\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.264785 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwv6l\" (UniqueName: \"kubernetes.io/projected/bb8bce39-6992-4785-a460-24d6def57630-kube-api-access-xwv6l\") pod \"bb8bce39-6992-4785-a460-24d6def57630\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.264878 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-ovsdbserver-sb\") pod \"bb8bce39-6992-4785-a460-24d6def57630\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.264993 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-config\") pod \"bb8bce39-6992-4785-a460-24d6def57630\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.265049 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-dns-svc\") pod \"bb8bce39-6992-4785-a460-24d6def57630\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.265389 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-config-data\") pod \"placement-db-sync-bnpcj\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") " pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.265422 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-ovsdbserver-sb\") pod \"dnsmasq-dns-5b6dbdb6f5-47q58\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.265485 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-config\") pod \"dnsmasq-dns-5b6dbdb6f5-47q58\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.265517 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1004910c-0db4-4e3d-aac5-358a557ee268-logs\") pod \"placement-db-sync-bnpcj\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") 
" pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.265537 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9qxl\" (UniqueName: \"kubernetes.io/projected/1004910c-0db4-4e3d-aac5-358a557ee268-kube-api-access-v9qxl\") pod \"placement-db-sync-bnpcj\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") " pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.265557 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-dns-svc\") pod \"dnsmasq-dns-5b6dbdb6f5-47q58\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.265575 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4km7\" (UniqueName: \"kubernetes.io/projected/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-kube-api-access-p4km7\") pod \"dnsmasq-dns-5b6dbdb6f5-47q58\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.265596 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-scripts\") pod \"placement-db-sync-bnpcj\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") " pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.265620 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-combined-ca-bundle\") pod \"placement-db-sync-bnpcj\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") " pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: 
I0309 13:40:42.265694 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-ovsdbserver-nb\") pod \"dnsmasq-dns-5b6dbdb6f5-47q58\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.266617 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-ovsdbserver-nb\") pod \"dnsmasq-dns-5b6dbdb6f5-47q58\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.268677 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-config\") pod \"dnsmasq-dns-5b6dbdb6f5-47q58\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.268691 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1004910c-0db4-4e3d-aac5-358a557ee268-logs\") pod \"placement-db-sync-bnpcj\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") " pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.269690 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-dns-svc\") pod \"dnsmasq-dns-5b6dbdb6f5-47q58\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.274523 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-ovsdbserver-sb\") pod \"dnsmasq-dns-5b6dbdb6f5-47q58\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.291974 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb8bce39-6992-4785-a460-24d6def57630-kube-api-access-xwv6l" (OuterVolumeSpecName: "kube-api-access-xwv6l") pod "bb8bce39-6992-4785-a460-24d6def57630" (UID: "bb8bce39-6992-4785-a460-24d6def57630"). InnerVolumeSpecName "kube-api-access-xwv6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.292203 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-scripts\") pod \"placement-db-sync-bnpcj\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") " pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.296707 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-config-data\") pod \"placement-db-sync-bnpcj\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") " pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.297290 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4km7\" (UniqueName: \"kubernetes.io/projected/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-kube-api-access-p4km7\") pod \"dnsmasq-dns-5b6dbdb6f5-47q58\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.297765 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.300594 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9qxl\" (UniqueName: \"kubernetes.io/projected/1004910c-0db4-4e3d-aac5-358a557ee268-kube-api-access-v9qxl\") pod \"placement-db-sync-bnpcj\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") " pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.302359 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-combined-ca-bundle\") pod \"placement-db-sync-bnpcj\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") " pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.341335 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-x9gvc" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.345751 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bb8bce39-6992-4785-a460-24d6def57630" (UID: "bb8bce39-6992-4785-a460-24d6def57630"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.346888 4764 scope.go:117] "RemoveContainer" containerID="645a2f0f2f39fa20266b4afd9cec3444922a6377324ddf9bf9434ae55054aa8e" Mar 09 13:40:42 crc kubenswrapper[4764]: E0309 13:40:42.347973 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"645a2f0f2f39fa20266b4afd9cec3444922a6377324ddf9bf9434ae55054aa8e\": container with ID starting with 645a2f0f2f39fa20266b4afd9cec3444922a6377324ddf9bf9434ae55054aa8e not found: ID does not exist" containerID="645a2f0f2f39fa20266b4afd9cec3444922a6377324ddf9bf9434ae55054aa8e" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.348022 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"645a2f0f2f39fa20266b4afd9cec3444922a6377324ddf9bf9434ae55054aa8e"} err="failed to get container status \"645a2f0f2f39fa20266b4afd9cec3444922a6377324ddf9bf9434ae55054aa8e\": rpc error: code = NotFound desc = could not find container \"645a2f0f2f39fa20266b4afd9cec3444922a6377324ddf9bf9434ae55054aa8e\": container with ID starting with 645a2f0f2f39fa20266b4afd9cec3444922a6377324ddf9bf9434ae55054aa8e not found: ID does not exist" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.348051 4764 scope.go:117] "RemoveContainer" containerID="0a568401ccdf9f344a39bc852ee338a2a7b840ec0f974c1b01a9b85e341b5181" Mar 09 13:40:42 crc kubenswrapper[4764]: E0309 13:40:42.348408 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a568401ccdf9f344a39bc852ee338a2a7b840ec0f974c1b01a9b85e341b5181\": container with ID starting with 0a568401ccdf9f344a39bc852ee338a2a7b840ec0f974c1b01a9b85e341b5181 not found: ID does not exist" containerID="0a568401ccdf9f344a39bc852ee338a2a7b840ec0f974c1b01a9b85e341b5181" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.348432 
4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a568401ccdf9f344a39bc852ee338a2a7b840ec0f974c1b01a9b85e341b5181"} err="failed to get container status \"0a568401ccdf9f344a39bc852ee338a2a7b840ec0f974c1b01a9b85e341b5181\": rpc error: code = NotFound desc = could not find container \"0a568401ccdf9f344a39bc852ee338a2a7b840ec0f974c1b01a9b85e341b5181\": container with ID starting with 0a568401ccdf9f344a39bc852ee338a2a7b840ec0f974c1b01a9b85e341b5181 not found: ID does not exist" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.367409 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.367569 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwv6l\" (UniqueName: \"kubernetes.io/projected/bb8bce39-6992-4785-a460-24d6def57630-kube-api-access-xwv6l\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.367403 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-config" (OuterVolumeSpecName: "config") pod "bb8bce39-6992-4785-a460-24d6def57630" (UID: "bb8bce39-6992-4785-a460-24d6def57630"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.376753 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.393372 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.403196 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bb8bce39-6992-4785-a460-24d6def57630" (UID: "bb8bce39-6992-4785-a460-24d6def57630"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.404369 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bb8bce39-6992-4785-a460-24d6def57630" (UID: "bb8bce39-6992-4785-a460-24d6def57630"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.468712 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.468743 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.468752 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.554686 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-554567b4f7-28h2r"] Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.562016 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-554567b4f7-28h2r"] Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.613666 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mtsrr"] Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.638656 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67795cd9-n77tv"] Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.874726 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.065410 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-cmhtp"] Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.224173 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cmhtp" event={"ID":"34466abc-30eb-4a0c-b4ea-50b5ab368fa1","Type":"ContainerStarted","Data":"7aa42fe65ff3f50e964cbe57813dcd4f0b9049443a2784caec6019418fe3ac3d"} Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.247217 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mtsrr" event={"ID":"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62","Type":"ContainerStarted","Data":"14bfbdfc3fcd7dcb9efec055a62cdfeec5d1e0e90e453a55c55ffd01ca49ad5a"} Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.247266 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mtsrr" event={"ID":"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62","Type":"ContainerStarted","Data":"964dc46977581745072ef70168c2ac93d99beadaf259b69a6032dafaefc2a059"} Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.254307 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-dp5x6"] Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.283176 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"861cdd7d-b563-4009-9c33-a5c64d6ffae9","Type":"ContainerStarted","Data":"6eab7083503ac11b2f955fc7b67907a3eb60734c7262ad63357465f6782429c0"} Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.286964 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-x9gvc"] Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.299061 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-47q58"] Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.307520 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-mtsrr" podStartSLOduration=2.307490315 podStartE2EDuration="2.307490315s" podCreationTimestamp="2026-03-09 13:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:40:43.28536864 +0000 UTC m=+1198.535540548" watchObservedRunningTime="2026-03-09 13:40:43.307490315 +0000 UTC m=+1198.557662223" Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.325425 4764 generic.go:334] "Generic (PLEG): container finished" podID="2a710e46-50b7-4069-b15c-ee3d19bc06e0" containerID="550203bc7f1c151fba9e233eabd544aa66da7e46f4d007a4f7cc69ce9aa3acf6" exitCode=0 Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.325478 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67795cd9-n77tv" event={"ID":"2a710e46-50b7-4069-b15c-ee3d19bc06e0","Type":"ContainerDied","Data":"550203bc7f1c151fba9e233eabd544aa66da7e46f4d007a4f7cc69ce9aa3acf6"} Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.325507 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67795cd9-n77tv" event={"ID":"2a710e46-50b7-4069-b15c-ee3d19bc06e0","Type":"ContainerStarted","Data":"c4a9f43a5d65d0ed33c7939a176de965247fc3684b2222485a9ce263feb9c4e8"} Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.352941 4764 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-bnpcj"] Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.627241 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb8bce39-6992-4785-a460-24d6def57630" path="/var/lib/kubelet/pods/bb8bce39-6992-4785-a460-24d6def57630/volumes" Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.825467 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.887332 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.909603 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-ovsdbserver-sb\") pod \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.909785 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hs6wz\" (UniqueName: \"kubernetes.io/projected/2a710e46-50b7-4069-b15c-ee3d19bc06e0-kube-api-access-hs6wz\") pod \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.909830 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-config\") pod \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.909905 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-ovsdbserver-nb\") pod 
\"2a710e46-50b7-4069-b15c-ee3d19bc06e0\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.909977 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-dns-svc\") pod \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.944842 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a710e46-50b7-4069-b15c-ee3d19bc06e0-kube-api-access-hs6wz" (OuterVolumeSpecName: "kube-api-access-hs6wz") pod "2a710e46-50b7-4069-b15c-ee3d19bc06e0" (UID: "2a710e46-50b7-4069-b15c-ee3d19bc06e0"). InnerVolumeSpecName "kube-api-access-hs6wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.961427 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2a710e46-50b7-4069-b15c-ee3d19bc06e0" (UID: "2a710e46-50b7-4069-b15c-ee3d19bc06e0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.989233 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2a710e46-50b7-4069-b15c-ee3d19bc06e0" (UID: "2a710e46-50b7-4069-b15c-ee3d19bc06e0"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.017029 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.017065 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hs6wz\" (UniqueName: \"kubernetes.io/projected/2a710e46-50b7-4069-b15c-ee3d19bc06e0-kube-api-access-hs6wz\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.017081 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.018492 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2a710e46-50b7-4069-b15c-ee3d19bc06e0" (UID: "2a710e46-50b7-4069-b15c-ee3d19bc06e0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.022578 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-config" (OuterVolumeSpecName: "config") pod "2a710e46-50b7-4069-b15c-ee3d19bc06e0" (UID: "2a710e46-50b7-4069-b15c-ee3d19bc06e0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.119171 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.119204 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.354163 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67795cd9-n77tv" event={"ID":"2a710e46-50b7-4069-b15c-ee3d19bc06e0","Type":"ContainerDied","Data":"c4a9f43a5d65d0ed33c7939a176de965247fc3684b2222485a9ce263feb9c4e8"} Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.354211 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.354240 4764 scope.go:117] "RemoveContainer" containerID="550203bc7f1c151fba9e233eabd544aa66da7e46f4d007a4f7cc69ce9aa3acf6" Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.356885 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cmhtp" event={"ID":"34466abc-30eb-4a0c-b4ea-50b5ab368fa1","Type":"ContainerStarted","Data":"77b865a9fe4889c82ec1f58b4a21addc379165d3a6897c87913c6042aa1f357b"} Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.360955 4764 generic.go:334] "Generic (PLEG): container finished" podID="f99ecd3e-1d5b-4f3d-80d7-9a401171fef1" containerID="643b445a5cdf8e66cc772b07777e2b70383e28ddead01aaa535ab1ef9fb46333" exitCode=0 Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.361007 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" event={"ID":"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1","Type":"ContainerDied","Data":"643b445a5cdf8e66cc772b07777e2b70383e28ddead01aaa535ab1ef9fb46333"} Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.361025 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" event={"ID":"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1","Type":"ContainerStarted","Data":"29094e7809129e1e7698bb9912d54d76c17888d5bfac065889c5bd5838e4b71c"} Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.367476 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-bnpcj" event={"ID":"1004910c-0db4-4e3d-aac5-358a557ee268","Type":"ContainerStarted","Data":"4a62e7bc40a3d288cb2716e66a0b772309aa68a835605c18dfc2a81767e2ae07"} Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.372358 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dp5x6" 
event={"ID":"74146b7d-9780-4d2d-9454-853296f88955","Type":"ContainerStarted","Data":"eedf3d5fa18d75dee38a49a615e9d2f0a831d8c6f13a33063d424dd9831c9a81"} Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.388166 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-x9gvc" event={"ID":"cb54f57d-afb6-4e53-be9a-4b22573a9450","Type":"ContainerStarted","Data":"7611ab820e5631fb861e6ee00f8e6a6553e8f141cb68d0a4556a1a0beeb23d3b"} Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.408605 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-cmhtp" podStartSLOduration=3.408579764 podStartE2EDuration="3.408579764s" podCreationTimestamp="2026-03-09 13:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:40:44.383520116 +0000 UTC m=+1199.633692024" watchObservedRunningTime="2026-03-09 13:40:44.408579764 +0000 UTC m=+1199.658751662" Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.448368 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67795cd9-n77tv"] Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.457859 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67795cd9-n77tv"] Mar 09 13:40:45 crc kubenswrapper[4764]: I0309 13:40:45.435069 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" event={"ID":"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1","Type":"ContainerStarted","Data":"1c12c99553c3fded7ba52a026664e1bff10eba1ac57dc2b2484eb57141804c22"} Mar 09 13:40:45 crc kubenswrapper[4764]: I0309 13:40:45.435489 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:45 crc kubenswrapper[4764]: I0309 13:40:45.474096 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" podStartSLOduration=4.47407667 podStartE2EDuration="4.47407667s" podCreationTimestamp="2026-03-09 13:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:40:45.466778937 +0000 UTC m=+1200.716950855" watchObservedRunningTime="2026-03-09 13:40:45.47407667 +0000 UTC m=+1200.724248578" Mar 09 13:40:45 crc kubenswrapper[4764]: I0309 13:40:45.595574 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a710e46-50b7-4069-b15c-ee3d19bc06e0" path="/var/lib/kubelet/pods/2a710e46-50b7-4069-b15c-ee3d19bc06e0/volumes" Mar 09 13:40:48 crc kubenswrapper[4764]: I0309 13:40:48.464913 4764 generic.go:334] "Generic (PLEG): container finished" podID="be29c97d-4c8d-4e5b-9e12-07cdc4df5a62" containerID="14bfbdfc3fcd7dcb9efec055a62cdfeec5d1e0e90e453a55c55ffd01ca49ad5a" exitCode=0 Mar 09 13:40:48 crc kubenswrapper[4764]: I0309 13:40:48.465006 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mtsrr" event={"ID":"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62","Type":"ContainerDied","Data":"14bfbdfc3fcd7dcb9efec055a62cdfeec5d1e0e90e453a55c55ffd01ca49ad5a"} Mar 09 13:40:49 crc kubenswrapper[4764]: I0309 13:40:49.866049 4764 scope.go:117] "RemoveContainer" containerID="33a445a6adcd631bbd08283597c203355f22e04e83cf13fd98ba9c1f71c00bd6" Mar 09 13:40:51 crc kubenswrapper[4764]: I0309 13:40:51.951056 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.125519 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-combined-ca-bundle\") pod \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.125707 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6nvs\" (UniqueName: \"kubernetes.io/projected/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-kube-api-access-t6nvs\") pod \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.125740 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-credential-keys\") pod \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.127058 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-scripts\") pod \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.127194 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-config-data\") pod \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.127235 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-fernet-keys\") pod \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.134679 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-kube-api-access-t6nvs" (OuterVolumeSpecName: "kube-api-access-t6nvs") pod "be29c97d-4c8d-4e5b-9e12-07cdc4df5a62" (UID: "be29c97d-4c8d-4e5b-9e12-07cdc4df5a62"). InnerVolumeSpecName "kube-api-access-t6nvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.137106 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "be29c97d-4c8d-4e5b-9e12-07cdc4df5a62" (UID: "be29c97d-4c8d-4e5b-9e12-07cdc4df5a62"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.137182 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-scripts" (OuterVolumeSpecName: "scripts") pod "be29c97d-4c8d-4e5b-9e12-07cdc4df5a62" (UID: "be29c97d-4c8d-4e5b-9e12-07cdc4df5a62"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.155420 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "be29c97d-4c8d-4e5b-9e12-07cdc4df5a62" (UID: "be29c97d-4c8d-4e5b-9e12-07cdc4df5a62"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.158390 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be29c97d-4c8d-4e5b-9e12-07cdc4df5a62" (UID: "be29c97d-4c8d-4e5b-9e12-07cdc4df5a62"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.164206 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-config-data" (OuterVolumeSpecName: "config-data") pod "be29c97d-4c8d-4e5b-9e12-07cdc4df5a62" (UID: "be29c97d-4c8d-4e5b-9e12-07cdc4df5a62"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.231228 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.231273 4764 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.231284 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.231298 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6nvs\" (UniqueName: \"kubernetes.io/projected/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-kube-api-access-t6nvs\") on node \"crc\" DevicePath 
\"\"" Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.231309 4764 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.231318 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.378970 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.473715 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-p289h"] Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.474116 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-p289h" podUID="e921061b-2a0f-4b22-beb1-0d52993dc06b" containerName="dnsmasq-dns" containerID="cri-o://fce1b5bc8a63a55a1345ac17564049805c448dc5638e52c8bc6cdc086dbbb27d" gracePeriod=10 Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.527735 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mtsrr" event={"ID":"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62","Type":"ContainerDied","Data":"964dc46977581745072ef70168c2ac93d99beadaf259b69a6032dafaefc2a059"} Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.527786 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="964dc46977581745072ef70168c2ac93d99beadaf259b69a6032dafaefc2a059" Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.527874 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.042306 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-mtsrr"] Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.049626 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-mtsrr"] Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.167637 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-9sj6m"] Mar 09 13:40:53 crc kubenswrapper[4764]: E0309 13:40:53.168045 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8bce39-6992-4785-a460-24d6def57630" containerName="init" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.168064 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8bce39-6992-4785-a460-24d6def57630" containerName="init" Mar 09 13:40:53 crc kubenswrapper[4764]: E0309 13:40:53.168082 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be29c97d-4c8d-4e5b-9e12-07cdc4df5a62" containerName="keystone-bootstrap" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.168091 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="be29c97d-4c8d-4e5b-9e12-07cdc4df5a62" containerName="keystone-bootstrap" Mar 09 13:40:53 crc kubenswrapper[4764]: E0309 13:40:53.168114 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8bce39-6992-4785-a460-24d6def57630" containerName="dnsmasq-dns" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.168121 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8bce39-6992-4785-a460-24d6def57630" containerName="dnsmasq-dns" Mar 09 13:40:53 crc kubenswrapper[4764]: E0309 13:40:53.168135 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a710e46-50b7-4069-b15c-ee3d19bc06e0" containerName="init" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.168140 4764 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="2a710e46-50b7-4069-b15c-ee3d19bc06e0" containerName="init" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.168294 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb8bce39-6992-4785-a460-24d6def57630" containerName="dnsmasq-dns" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.168307 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a710e46-50b7-4069-b15c-ee3d19bc06e0" containerName="init" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.168326 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="be29c97d-4c8d-4e5b-9e12-07cdc4df5a62" containerName="keystone-bootstrap" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.168938 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.174616 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.174880 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.174931 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4hfql" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.175221 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.175253 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.192638 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-9sj6m"] Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.270511 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-fernet-keys\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.270547 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-config-data\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.270570 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-combined-ca-bundle\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.270602 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-credential-keys\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.270637 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw9hw\" (UniqueName: \"kubernetes.io/projected/2a338463-1443-4863-830e-0621abc3ed15-kube-api-access-mw9hw\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.270693 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-scripts\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.371898 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-credential-keys\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.371973 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw9hw\" (UniqueName: \"kubernetes.io/projected/2a338463-1443-4863-830e-0621abc3ed15-kube-api-access-mw9hw\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.372026 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-scripts\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.372096 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-fernet-keys\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.372116 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-config-data\") pod 
\"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.372135 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-combined-ca-bundle\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.377570 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-credential-keys\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.378776 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-fernet-keys\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.380390 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-scripts\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.383738 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-config-data\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc 
kubenswrapper[4764]: I0309 13:40:53.394512 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-combined-ca-bundle\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.401262 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw9hw\" (UniqueName: \"kubernetes.io/projected/2a338463-1443-4863-830e-0621abc3ed15-kube-api-access-mw9hw\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.487393 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.543850 4764 generic.go:334] "Generic (PLEG): container finished" podID="e921061b-2a0f-4b22-beb1-0d52993dc06b" containerID="fce1b5bc8a63a55a1345ac17564049805c448dc5638e52c8bc6cdc086dbbb27d" exitCode=0 Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.543898 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-p289h" event={"ID":"e921061b-2a0f-4b22-beb1-0d52993dc06b","Type":"ContainerDied","Data":"fce1b5bc8a63a55a1345ac17564049805c448dc5638e52c8bc6cdc086dbbb27d"} Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.570866 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be29c97d-4c8d-4e5b-9e12-07cdc4df5a62" path="/var/lib/kubelet/pods/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62/volumes" Mar 09 13:40:57 crc kubenswrapper[4764]: I0309 13:40:57.436111 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8554648995-p289h" podUID="e921061b-2a0f-4b22-beb1-0d52993dc06b" containerName="dnsmasq-dns" 
probeResult="failure" output="dial tcp 10.217.0.113:5353: connect: connection refused" Mar 09 13:40:58 crc kubenswrapper[4764]: I0309 13:40:58.370904 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:40:58 crc kubenswrapper[4764]: I0309 13:40:58.370963 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:40:58 crc kubenswrapper[4764]: I0309 13:40:58.371015 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:40:58 crc kubenswrapper[4764]: I0309 13:40:58.371771 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"089fede4f6de3963d0297d0b784543ed7a9d38cdefe66f6bbb9aab989dbffbb0"} pod="openshift-machine-config-operator/machine-config-daemon-xxczl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 13:40:58 crc kubenswrapper[4764]: I0309 13:40:58.371823 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" containerID="cri-o://089fede4f6de3963d0297d0b784543ed7a9d38cdefe66f6bbb9aab989dbffbb0" gracePeriod=600 Mar 09 13:40:58 crc kubenswrapper[4764]: I0309 13:40:58.596373 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="6bcdd179-43c2-427c-9fac-7155c122e922" containerID="089fede4f6de3963d0297d0b784543ed7a9d38cdefe66f6bbb9aab989dbffbb0" exitCode=0 Mar 09 13:40:58 crc kubenswrapper[4764]: I0309 13:40:58.596439 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerDied","Data":"089fede4f6de3963d0297d0b784543ed7a9d38cdefe66f6bbb9aab989dbffbb0"} Mar 09 13:40:58 crc kubenswrapper[4764]: I0309 13:40:58.596544 4764 scope.go:117] "RemoveContainer" containerID="bd218b04a62035d950083134e237631295493daecba2baa1eef096560f891819" Mar 09 13:41:02 crc kubenswrapper[4764]: I0309 13:41:02.636265 4764 generic.go:334] "Generic (PLEG): container finished" podID="34466abc-30eb-4a0c-b4ea-50b5ab368fa1" containerID="77b865a9fe4889c82ec1f58b4a21addc379165d3a6897c87913c6042aa1f357b" exitCode=0 Mar 09 13:41:02 crc kubenswrapper[4764]: I0309 13:41:02.636349 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cmhtp" event={"ID":"34466abc-30eb-4a0c-b4ea-50b5ab368fa1","Type":"ContainerDied","Data":"77b865a9fe4889c82ec1f58b4a21addc379165d3a6897c87913c6042aa1f357b"} Mar 09 13:41:02 crc kubenswrapper[4764]: E0309 13:41:02.759669 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 09 13:41:02 crc kubenswrapper[4764]: E0309 13:41:02.759878 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7jvrd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-x9gvc_openstack(cb54f57d-afb6-4e53-be9a-4b22573a9450): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:41:02 crc kubenswrapper[4764]: E0309 13:41:02.761034 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-x9gvc" 
podUID="cb54f57d-afb6-4e53-be9a-4b22573a9450" Mar 09 13:41:03 crc kubenswrapper[4764]: E0309 13:41:03.651060 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-x9gvc" podUID="cb54f57d-afb6-4e53-be9a-4b22573a9450" Mar 09 13:41:03 crc kubenswrapper[4764]: E0309 13:41:03.941132 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 09 13:41:03 crc kubenswrapper[4764]: E0309 13:41:03.941690 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-khssq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-dp5x6_openstack(74146b7d-9780-4d2d-9454-853296f88955): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:41:03 crc kubenswrapper[4764]: E0309 13:41:03.942906 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-dp5x6" podUID="74146b7d-9780-4d2d-9454-853296f88955" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.233025 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.283261 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-cmhtp" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.373961 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-9sj6m"] Mar 09 13:41:04 crc kubenswrapper[4764]: W0309 13:41:04.385879 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a338463_1443_4863_830e_0621abc3ed15.slice/crio-38fa86d2f8bb849de0695f8ca7f22c2d7e73ace5965981aa2e9b4258c6a40283 WatchSource:0}: Error finding container 38fa86d2f8bb849de0695f8ca7f22c2d7e73ace5965981aa2e9b4258c6a40283: Status 404 returned error can't find the container with id 38fa86d2f8bb849de0695f8ca7f22c2d7e73ace5965981aa2e9b4258c6a40283 Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.392487 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-config\") pod \"34466abc-30eb-4a0c-b4ea-50b5ab368fa1\" (UID: \"34466abc-30eb-4a0c-b4ea-50b5ab368fa1\") " Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.392630 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-combined-ca-bundle\") pod \"34466abc-30eb-4a0c-b4ea-50b5ab368fa1\" (UID: \"34466abc-30eb-4a0c-b4ea-50b5ab368fa1\") " Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.392702 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkvzb\" (UniqueName: \"kubernetes.io/projected/e921061b-2a0f-4b22-beb1-0d52993dc06b-kube-api-access-xkvzb\") pod \"e921061b-2a0f-4b22-beb1-0d52993dc06b\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.392787 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-ovsdbserver-sb\") pod \"e921061b-2a0f-4b22-beb1-0d52993dc06b\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.392937 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-config\") pod \"e921061b-2a0f-4b22-beb1-0d52993dc06b\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.392988 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lj9f\" (UniqueName: \"kubernetes.io/projected/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-kube-api-access-8lj9f\") pod \"34466abc-30eb-4a0c-b4ea-50b5ab368fa1\" (UID: \"34466abc-30eb-4a0c-b4ea-50b5ab368fa1\") " Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.393023 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-ovsdbserver-nb\") pod \"e921061b-2a0f-4b22-beb1-0d52993dc06b\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.393091 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-dns-svc\") pod \"e921061b-2a0f-4b22-beb1-0d52993dc06b\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.397464 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-kube-api-access-8lj9f" (OuterVolumeSpecName: "kube-api-access-8lj9f") pod "34466abc-30eb-4a0c-b4ea-50b5ab368fa1" (UID: "34466abc-30eb-4a0c-b4ea-50b5ab368fa1"). InnerVolumeSpecName "kube-api-access-8lj9f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.397325 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e921061b-2a0f-4b22-beb1-0d52993dc06b-kube-api-access-xkvzb" (OuterVolumeSpecName: "kube-api-access-xkvzb") pod "e921061b-2a0f-4b22-beb1-0d52993dc06b" (UID: "e921061b-2a0f-4b22-beb1-0d52993dc06b"). InnerVolumeSpecName "kube-api-access-xkvzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.422623 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34466abc-30eb-4a0c-b4ea-50b5ab368fa1" (UID: "34466abc-30eb-4a0c-b4ea-50b5ab368fa1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.434931 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-config" (OuterVolumeSpecName: "config") pod "34466abc-30eb-4a0c-b4ea-50b5ab368fa1" (UID: "34466abc-30eb-4a0c-b4ea-50b5ab368fa1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.443779 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-config" (OuterVolumeSpecName: "config") pod "e921061b-2a0f-4b22-beb1-0d52993dc06b" (UID: "e921061b-2a0f-4b22-beb1-0d52993dc06b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.443864 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e921061b-2a0f-4b22-beb1-0d52993dc06b" (UID: "e921061b-2a0f-4b22-beb1-0d52993dc06b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.444421 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e921061b-2a0f-4b22-beb1-0d52993dc06b" (UID: "e921061b-2a0f-4b22-beb1-0d52993dc06b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.470242 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e921061b-2a0f-4b22-beb1-0d52993dc06b" (UID: "e921061b-2a0f-4b22-beb1-0d52993dc06b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.495028 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.495064 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lj9f\" (UniqueName: \"kubernetes.io/projected/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-kube-api-access-8lj9f\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.495078 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.495088 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.495100 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.495109 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.495117 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkvzb\" (UniqueName: \"kubernetes.io/projected/e921061b-2a0f-4b22-beb1-0d52993dc06b-kube-api-access-xkvzb\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.495125 4764 
reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.661804 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-p289h" event={"ID":"e921061b-2a0f-4b22-beb1-0d52993dc06b","Type":"ContainerDied","Data":"9c01a77a060dbab4de5d1ba1f06fcd3807020da1983c9df162b0099cb08b09d0"} Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.661863 4764 scope.go:117] "RemoveContainer" containerID="fce1b5bc8a63a55a1345ac17564049805c448dc5638e52c8bc6cdc086dbbb27d" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.661967 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.664553 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-cmhtp" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.664566 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cmhtp" event={"ID":"34466abc-30eb-4a0c-b4ea-50b5ab368fa1","Type":"ContainerDied","Data":"7aa42fe65ff3f50e964cbe57813dcd4f0b9049443a2784caec6019418fe3ac3d"} Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.664680 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7aa42fe65ff3f50e964cbe57813dcd4f0b9049443a2784caec6019418fe3ac3d" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.667357 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-bnpcj" event={"ID":"1004910c-0db4-4e3d-aac5-358a557ee268","Type":"ContainerStarted","Data":"ec23e6a8c58e7f0bc7312e60b63b0c20314c347b59fda500de1bb63ca3c5e6c6"} Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.670071 4764 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/keystone-bootstrap-9sj6m" event={"ID":"2a338463-1443-4863-830e-0621abc3ed15","Type":"ContainerStarted","Data":"bcb17cd274a1a85d7439c286f89b2351717de890e1de960b1f945d832ef377e9"} Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.670111 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9sj6m" event={"ID":"2a338463-1443-4863-830e-0621abc3ed15","Type":"ContainerStarted","Data":"38fa86d2f8bb849de0695f8ca7f22c2d7e73ace5965981aa2e9b4258c6a40283"} Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.682705 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"69810f80271be58962525ba5a5a37ce68d5e172f7c253c440a5a45f3f3beab77"} Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.688527 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"861cdd7d-b563-4009-9c33-a5c64d6ffae9","Type":"ContainerStarted","Data":"2486ace2409cc48f14efc281a2eed3fa9e4956d145642ec1061f64d96ea18a5f"} Mar 09 13:41:04 crc kubenswrapper[4764]: E0309 13:41:04.704041 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-dp5x6" podUID="74146b7d-9780-4d2d-9454-853296f88955" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.704349 4764 scope.go:117] "RemoveContainer" containerID="7c314d6ef6d0a466bb5e22ea5e084d1cbdc6ed4e3e9ab6aee110b11fa331f5fb" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.707769 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-bnpcj" podStartSLOduration=3.192289793 podStartE2EDuration="23.707681248s" 
podCreationTimestamp="2026-03-09 13:40:41 +0000 UTC" firstStartedPulling="2026-03-09 13:40:43.392378153 +0000 UTC m=+1198.642550061" lastFinishedPulling="2026-03-09 13:41:03.907769608 +0000 UTC m=+1219.157941516" observedRunningTime="2026-03-09 13:41:04.702613631 +0000 UTC m=+1219.952785529" watchObservedRunningTime="2026-03-09 13:41:04.707681248 +0000 UTC m=+1219.957853156" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.724002 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-9sj6m" podStartSLOduration=11.723951326 podStartE2EDuration="11.723951326s" podCreationTimestamp="2026-03-09 13:40:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:41:04.722343166 +0000 UTC m=+1219.972515104" watchObservedRunningTime="2026-03-09 13:41:04.723951326 +0000 UTC m=+1219.974123244" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.772710 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-p289h"] Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.781188 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-p289h"] Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.841699 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-sh5rn"] Mar 09 13:41:04 crc kubenswrapper[4764]: E0309 13:41:04.842267 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e921061b-2a0f-4b22-beb1-0d52993dc06b" containerName="dnsmasq-dns" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.842288 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e921061b-2a0f-4b22-beb1-0d52993dc06b" containerName="dnsmasq-dns" Mar 09 13:41:04 crc kubenswrapper[4764]: E0309 13:41:04.842306 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34466abc-30eb-4a0c-b4ea-50b5ab368fa1" 
containerName="neutron-db-sync" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.842313 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="34466abc-30eb-4a0c-b4ea-50b5ab368fa1" containerName="neutron-db-sync" Mar 09 13:41:04 crc kubenswrapper[4764]: E0309 13:41:04.842344 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e921061b-2a0f-4b22-beb1-0d52993dc06b" containerName="init" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.842351 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e921061b-2a0f-4b22-beb1-0d52993dc06b" containerName="init" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.842529 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="34466abc-30eb-4a0c-b4ea-50b5ab368fa1" containerName="neutron-db-sync" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.842547 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e921061b-2a0f-4b22-beb1-0d52993dc06b" containerName="dnsmasq-dns" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.844141 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.854335 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-sh5rn"] Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.964282 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6b74bc6bc6-vsxl5"] Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.966583 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6b74bc6bc6-vsxl5" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.970015 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.970496 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.970925 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.971223 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-fn2ft" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.979427 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6b74bc6bc6-vsxl5"] Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.006431 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-config\") pod \"dnsmasq-dns-5f66db59b9-sh5rn\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.006494 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f92p7\" (UniqueName: \"kubernetes.io/projected/5b2b268a-adc9-46ca-908a-d30ab8543059-kube-api-access-f92p7\") pod \"dnsmasq-dns-5f66db59b9-sh5rn\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.006544 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-ovsdbserver-sb\") pod 
\"dnsmasq-dns-5f66db59b9-sh5rn\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.006564 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-dns-svc\") pod \"dnsmasq-dns-5f66db59b9-sh5rn\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.006614 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-ovsdbserver-nb\") pod \"dnsmasq-dns-5f66db59b9-sh5rn\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.108494 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnqrq\" (UniqueName: \"kubernetes.io/projected/50610296-d076-4c9f-ac34-a976202ce135-kube-api-access-xnqrq\") pod \"neutron-6b74bc6bc6-vsxl5\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " pod="openstack/neutron-6b74bc6bc6-vsxl5" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.108613 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-httpd-config\") pod \"neutron-6b74bc6bc6-vsxl5\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " pod="openstack/neutron-6b74bc6bc6-vsxl5" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.108666 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-combined-ca-bundle\") pod \"neutron-6b74bc6bc6-vsxl5\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " pod="openstack/neutron-6b74bc6bc6-vsxl5" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.108699 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-config\") pod \"dnsmasq-dns-5f66db59b9-sh5rn\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.108728 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f92p7\" (UniqueName: \"kubernetes.io/projected/5b2b268a-adc9-46ca-908a-d30ab8543059-kube-api-access-f92p7\") pod \"dnsmasq-dns-5f66db59b9-sh5rn\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.108762 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-ovsdbserver-sb\") pod \"dnsmasq-dns-5f66db59b9-sh5rn\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.108781 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-dns-svc\") pod \"dnsmasq-dns-5f66db59b9-sh5rn\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.108809 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-ovndb-tls-certs\") pod \"neutron-6b74bc6bc6-vsxl5\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " pod="openstack/neutron-6b74bc6bc6-vsxl5" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.108835 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-ovsdbserver-nb\") pod \"dnsmasq-dns-5f66db59b9-sh5rn\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.108857 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-config\") pod \"neutron-6b74bc6bc6-vsxl5\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " pod="openstack/neutron-6b74bc6bc6-vsxl5" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.110005 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-ovsdbserver-sb\") pod \"dnsmasq-dns-5f66db59b9-sh5rn\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.110232 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-dns-svc\") pod \"dnsmasq-dns-5f66db59b9-sh5rn\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.110853 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-config\") pod \"dnsmasq-dns-5f66db59b9-sh5rn\" (UID: 
\"5b2b268a-adc9-46ca-908a-d30ab8543059\") " pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.111269 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-ovsdbserver-nb\") pod \"dnsmasq-dns-5f66db59b9-sh5rn\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.146229 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f92p7\" (UniqueName: \"kubernetes.io/projected/5b2b268a-adc9-46ca-908a-d30ab8543059-kube-api-access-f92p7\") pod \"dnsmasq-dns-5f66db59b9-sh5rn\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.183494 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.210038 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-ovndb-tls-certs\") pod \"neutron-6b74bc6bc6-vsxl5\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " pod="openstack/neutron-6b74bc6bc6-vsxl5" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.210092 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-config\") pod \"neutron-6b74bc6bc6-vsxl5\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " pod="openstack/neutron-6b74bc6bc6-vsxl5" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.210143 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnqrq\" (UniqueName: 
\"kubernetes.io/projected/50610296-d076-4c9f-ac34-a976202ce135-kube-api-access-xnqrq\") pod \"neutron-6b74bc6bc6-vsxl5\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " pod="openstack/neutron-6b74bc6bc6-vsxl5" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.210188 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-httpd-config\") pod \"neutron-6b74bc6bc6-vsxl5\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " pod="openstack/neutron-6b74bc6bc6-vsxl5" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.210230 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-combined-ca-bundle\") pod \"neutron-6b74bc6bc6-vsxl5\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " pod="openstack/neutron-6b74bc6bc6-vsxl5" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.218577 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-combined-ca-bundle\") pod \"neutron-6b74bc6bc6-vsxl5\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " pod="openstack/neutron-6b74bc6bc6-vsxl5" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.218576 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-ovndb-tls-certs\") pod \"neutron-6b74bc6bc6-vsxl5\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " pod="openstack/neutron-6b74bc6bc6-vsxl5" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.218983 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-config\") pod \"neutron-6b74bc6bc6-vsxl5\" (UID: 
\"50610296-d076-4c9f-ac34-a976202ce135\") " pod="openstack/neutron-6b74bc6bc6-vsxl5" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.220009 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-httpd-config\") pod \"neutron-6b74bc6bc6-vsxl5\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " pod="openstack/neutron-6b74bc6bc6-vsxl5" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.239010 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnqrq\" (UniqueName: \"kubernetes.io/projected/50610296-d076-4c9f-ac34-a976202ce135-kube-api-access-xnqrq\") pod \"neutron-6b74bc6bc6-vsxl5\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " pod="openstack/neutron-6b74bc6bc6-vsxl5" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.287124 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6b74bc6bc6-vsxl5" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.596728 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e921061b-2a0f-4b22-beb1-0d52993dc06b" path="/var/lib/kubelet/pods/e921061b-2a0f-4b22-beb1-0d52993dc06b/volumes" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.792327 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-sh5rn"] Mar 09 13:41:06 crc kubenswrapper[4764]: I0309 13:41:06.017226 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6b74bc6bc6-vsxl5"] Mar 09 13:41:06 crc kubenswrapper[4764]: I0309 13:41:06.739846 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b74bc6bc6-vsxl5" event={"ID":"50610296-d076-4c9f-ac34-a976202ce135","Type":"ContainerStarted","Data":"508370eb92a6887d3b1ecf26300791f931d8f328026a6caf81b9a1ce10c18bed"} Mar 09 13:41:06 crc kubenswrapper[4764]: I0309 13:41:06.741154 4764 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/neutron-6b74bc6bc6-vsxl5" event={"ID":"50610296-d076-4c9f-ac34-a976202ce135","Type":"ContainerStarted","Data":"bd4c896bc38b604cb19726769c37db30c4145f3642057a166913e3d7cfd24c8f"} Mar 09 13:41:06 crc kubenswrapper[4764]: I0309 13:41:06.748594 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"861cdd7d-b563-4009-9c33-a5c64d6ffae9","Type":"ContainerStarted","Data":"4db2afcb94ef0a26dd49d9961d5346346b289d4dbcccce08d916a8fe7f319ea2"} Mar 09 13:41:06 crc kubenswrapper[4764]: I0309 13:41:06.752782 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" event={"ID":"5b2b268a-adc9-46ca-908a-d30ab8543059","Type":"ContainerStarted","Data":"8419a9151f23a452d93428da7d46b2dd5d9147864269035236c144959618c6a4"} Mar 09 13:41:06 crc kubenswrapper[4764]: I0309 13:41:06.753054 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" event={"ID":"5b2b268a-adc9-46ca-908a-d30ab8543059","Type":"ContainerStarted","Data":"d82d6336b352da3e8f6c37dd12f8ae1d4a1f5e3d8a23342826dc108286471929"} Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.216657 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6c97985d69-khcvg"] Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.228634 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6c97985d69-khcvg" Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.232280 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.238292 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.240026 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c97985d69-khcvg"] Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.377068 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnnlq\" (UniqueName: \"kubernetes.io/projected/5d65fa53-be02-4d11-b300-5cb4629c03da-kube-api-access-bnnlq\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg" Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.377131 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-public-tls-certs\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg" Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.377339 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-config\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg" Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.377602 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-httpd-config\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg" Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.377636 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-combined-ca-bundle\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg" Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.377677 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-ovndb-tls-certs\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg" Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.377730 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-internal-tls-certs\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg" Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.436834 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8554648995-p289h" podUID="e921061b-2a0f-4b22-beb1-0d52993dc06b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: i/o timeout" Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.480042 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-config\") pod \"neutron-6c97985d69-khcvg\" (UID: 
\"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg" Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.480182 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-httpd-config\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg" Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.480258 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-combined-ca-bundle\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg" Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.480294 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-ovndb-tls-certs\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg" Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.480337 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-internal-tls-certs\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg" Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.480574 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnnlq\" (UniqueName: \"kubernetes.io/projected/5d65fa53-be02-4d11-b300-5cb4629c03da-kube-api-access-bnnlq\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " 
pod="openstack/neutron-6c97985d69-khcvg" Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.480629 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-public-tls-certs\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg" Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.488735 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-config\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg" Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.490503 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-combined-ca-bundle\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg" Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.496502 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-public-tls-certs\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg" Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.498345 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-httpd-config\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg" Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.506584 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bnnlq\" (UniqueName: \"kubernetes.io/projected/5d65fa53-be02-4d11-b300-5cb4629c03da-kube-api-access-bnnlq\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg" Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.509142 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-internal-tls-certs\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg" Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.518086 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-ovndb-tls-certs\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg" Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.551316 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6c97985d69-khcvg" Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.764557 4764 generic.go:334] "Generic (PLEG): container finished" podID="1004910c-0db4-4e3d-aac5-358a557ee268" containerID="ec23e6a8c58e7f0bc7312e60b63b0c20314c347b59fda500de1bb63ca3c5e6c6" exitCode=0 Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.764672 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-bnpcj" event={"ID":"1004910c-0db4-4e3d-aac5-358a557ee268","Type":"ContainerDied","Data":"ec23e6a8c58e7f0bc7312e60b63b0c20314c347b59fda500de1bb63ca3c5e6c6"} Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.778425 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b74bc6bc6-vsxl5" event={"ID":"50610296-d076-4c9f-ac34-a976202ce135","Type":"ContainerStarted","Data":"40fc71094cc1ee70bba232001951743f253a32a95993837575a49f9b6e385df9"} Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.779470 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6b74bc6bc6-vsxl5" Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.810174 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6b74bc6bc6-vsxl5" podStartSLOduration=3.810140571 podStartE2EDuration="3.810140571s" podCreationTimestamp="2026-03-09 13:41:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:41:07.809781502 +0000 UTC m=+1223.059953420" watchObservedRunningTime="2026-03-09 13:41:07.810140571 +0000 UTC m=+1223.060312479" Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.810799 4764 generic.go:334] "Generic (PLEG): container finished" podID="5b2b268a-adc9-46ca-908a-d30ab8543059" containerID="8419a9151f23a452d93428da7d46b2dd5d9147864269035236c144959618c6a4" exitCode=0 Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.810941 
4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" event={"ID":"5b2b268a-adc9-46ca-908a-d30ab8543059","Type":"ContainerDied","Data":"8419a9151f23a452d93428da7d46b2dd5d9147864269035236c144959618c6a4"} Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.810977 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" event={"ID":"5b2b268a-adc9-46ca-908a-d30ab8543059","Type":"ContainerStarted","Data":"20dab72734e5903795d478bac44970ac57f0690f8f454dee9dae223fbd3e92ac"} Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.812370 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.914685 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" podStartSLOduration=3.9146523 podStartE2EDuration="3.9146523s" podCreationTimestamp="2026-03-09 13:41:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:41:07.859897658 +0000 UTC m=+1223.110069566" watchObservedRunningTime="2026-03-09 13:41:07.9146523 +0000 UTC m=+1223.164824208" Mar 09 13:41:08 crc kubenswrapper[4764]: I0309 13:41:08.183315 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c97985d69-khcvg"] Mar 09 13:41:08 crc kubenswrapper[4764]: I0309 13:41:08.820962 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c97985d69-khcvg" event={"ID":"5d65fa53-be02-4d11-b300-5cb4629c03da","Type":"ContainerStarted","Data":"63c3a5316fc5ec3fc301ebe753725e716ab876289ed6944f416ebd4b78894abb"} Mar 09 13:41:08 crc kubenswrapper[4764]: I0309 13:41:08.824007 4764 generic.go:334] "Generic (PLEG): container finished" podID="2a338463-1443-4863-830e-0621abc3ed15" 
containerID="bcb17cd274a1a85d7439c286f89b2351717de890e1de960b1f945d832ef377e9" exitCode=0 Mar 09 13:41:08 crc kubenswrapper[4764]: I0309 13:41:08.824064 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9sj6m" event={"ID":"2a338463-1443-4863-830e-0621abc3ed15","Type":"ContainerDied","Data":"bcb17cd274a1a85d7439c286f89b2351717de890e1de960b1f945d832ef377e9"} Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.298483 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-bnpcj" Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.426954 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9qxl\" (UniqueName: \"kubernetes.io/projected/1004910c-0db4-4e3d-aac5-358a557ee268-kube-api-access-v9qxl\") pod \"1004910c-0db4-4e3d-aac5-358a557ee268\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") " Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.427225 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1004910c-0db4-4e3d-aac5-358a557ee268-logs\") pod \"1004910c-0db4-4e3d-aac5-358a557ee268\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") " Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.427259 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-scripts\") pod \"1004910c-0db4-4e3d-aac5-358a557ee268\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") " Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.427294 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-config-data\") pod \"1004910c-0db4-4e3d-aac5-358a557ee268\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") " Mar 09 13:41:09 crc 
kubenswrapper[4764]: I0309 13:41:09.427316 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-combined-ca-bundle\") pod \"1004910c-0db4-4e3d-aac5-358a557ee268\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") " Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.427912 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1004910c-0db4-4e3d-aac5-358a557ee268-logs" (OuterVolumeSpecName: "logs") pod "1004910c-0db4-4e3d-aac5-358a557ee268" (UID: "1004910c-0db4-4e3d-aac5-358a557ee268"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.437236 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-scripts" (OuterVolumeSpecName: "scripts") pod "1004910c-0db4-4e3d-aac5-358a557ee268" (UID: "1004910c-0db4-4e3d-aac5-358a557ee268"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.456159 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1004910c-0db4-4e3d-aac5-358a557ee268-kube-api-access-v9qxl" (OuterVolumeSpecName: "kube-api-access-v9qxl") pod "1004910c-0db4-4e3d-aac5-358a557ee268" (UID: "1004910c-0db4-4e3d-aac5-358a557ee268"). InnerVolumeSpecName "kube-api-access-v9qxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.460447 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1004910c-0db4-4e3d-aac5-358a557ee268" (UID: "1004910c-0db4-4e3d-aac5-358a557ee268"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.461974 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-config-data" (OuterVolumeSpecName: "config-data") pod "1004910c-0db4-4e3d-aac5-358a557ee268" (UID: "1004910c-0db4-4e3d-aac5-358a557ee268"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.529690 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1004910c-0db4-4e3d-aac5-358a557ee268-logs\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.529733 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.529746 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.529760 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.529776 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9qxl\" (UniqueName: \"kubernetes.io/projected/1004910c-0db4-4e3d-aac5-358a557ee268-kube-api-access-v9qxl\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.859675 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-bnpcj" 
event={"ID":"1004910c-0db4-4e3d-aac5-358a557ee268","Type":"ContainerDied","Data":"4a62e7bc40a3d288cb2716e66a0b772309aa68a835605c18dfc2a81767e2ae07"} Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.859733 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a62e7bc40a3d288cb2716e66a0b772309aa68a835605c18dfc2a81767e2ae07" Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.859770 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-bnpcj" Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.871559 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c97985d69-khcvg" event={"ID":"5d65fa53-be02-4d11-b300-5cb4629c03da","Type":"ContainerStarted","Data":"af86c5d555559ee78e3558628ecf704c4c51493754120be1f860b79edb606896"} Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.900349 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-586d68b4fd-xj4tk"] Mar 09 13:41:09 crc kubenswrapper[4764]: E0309 13:41:09.900841 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1004910c-0db4-4e3d-aac5-358a557ee268" containerName="placement-db-sync" Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.900862 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1004910c-0db4-4e3d-aac5-358a557ee268" containerName="placement-db-sync" Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.901336 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1004910c-0db4-4e3d-aac5-358a557ee268" containerName="placement-db-sync" Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.902270 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.908132 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.913828 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.914183 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.914431 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-l4tqv" Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.914600 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.926478 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-586d68b4fd-xj4tk"] Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.040373 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-config-data\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.040508 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-combined-ca-bundle\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.040579 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-internal-tls-certs\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.040611 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-public-tls-certs\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.040769 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-scripts\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.040799 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8mnp\" (UniqueName: \"kubernetes.io/projected/e89507a7-04a1-444b-b38a-40b001ec079a-kube-api-access-j8mnp\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.041044 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e89507a7-04a1-444b-b38a-40b001ec079a-logs\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.145240 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-config-data\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.145357 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-combined-ca-bundle\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.145408 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-internal-tls-certs\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.145433 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-public-tls-certs\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.145460 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-scripts\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.145757 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-j8mnp\" (UniqueName: \"kubernetes.io/projected/e89507a7-04a1-444b-b38a-40b001ec079a-kube-api-access-j8mnp\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.145811 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e89507a7-04a1-444b-b38a-40b001ec079a-logs\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.150599 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e89507a7-04a1-444b-b38a-40b001ec079a-logs\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.151499 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-internal-tls-certs\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.152189 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-config-data\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.168574 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-public-tls-certs\") pod 
\"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.169638 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-combined-ca-bundle\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.173916 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8mnp\" (UniqueName: \"kubernetes.io/projected/e89507a7-04a1-444b-b38a-40b001ec079a-kube-api-access-j8mnp\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.190238 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-scripts\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.244508 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.863873 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.901745 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9sj6m" event={"ID":"2a338463-1443-4863-830e-0621abc3ed15","Type":"ContainerDied","Data":"38fa86d2f8bb849de0695f8ca7f22c2d7e73ace5965981aa2e9b4258c6a40283"} Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.901792 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38fa86d2f8bb849de0695f8ca7f22c2d7e73ace5965981aa2e9b4258c6a40283" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.901921 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.967128 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-combined-ca-bundle\") pod \"2a338463-1443-4863-830e-0621abc3ed15\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.967587 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mw9hw\" (UniqueName: \"kubernetes.io/projected/2a338463-1443-4863-830e-0621abc3ed15-kube-api-access-mw9hw\") pod \"2a338463-1443-4863-830e-0621abc3ed15\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.967668 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-config-data\") pod \"2a338463-1443-4863-830e-0621abc3ed15\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.967697 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-fernet-keys\") pod \"2a338463-1443-4863-830e-0621abc3ed15\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.967956 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-credential-keys\") pod \"2a338463-1443-4863-830e-0621abc3ed15\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.968017 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-scripts\") pod \"2a338463-1443-4863-830e-0621abc3ed15\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.973611 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a338463-1443-4863-830e-0621abc3ed15-kube-api-access-mw9hw" (OuterVolumeSpecName: "kube-api-access-mw9hw") pod "2a338463-1443-4863-830e-0621abc3ed15" (UID: "2a338463-1443-4863-830e-0621abc3ed15"). InnerVolumeSpecName "kube-api-access-mw9hw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.973629 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2a338463-1443-4863-830e-0621abc3ed15" (UID: "2a338463-1443-4863-830e-0621abc3ed15"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.975087 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-scripts" (OuterVolumeSpecName: "scripts") pod "2a338463-1443-4863-830e-0621abc3ed15" (UID: "2a338463-1443-4863-830e-0621abc3ed15"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.990614 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2a338463-1443-4863-830e-0621abc3ed15" (UID: "2a338463-1443-4863-830e-0621abc3ed15"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.997332 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a338463-1443-4863-830e-0621abc3ed15" (UID: "2a338463-1443-4863-830e-0621abc3ed15"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.008614 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-config-data" (OuterVolumeSpecName: "config-data") pod "2a338463-1443-4863-830e-0621abc3ed15" (UID: "2a338463-1443-4863-830e-0621abc3ed15"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.070048 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.070318 4764 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.070407 4764 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.070491 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.070548 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.070602 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mw9hw\" (UniqueName: \"kubernetes.io/projected/2a338463-1443-4863-830e-0621abc3ed15-kube-api-access-mw9hw\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.977917 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-759c9c64fb-nwls6"] Mar 09 13:41:11 crc kubenswrapper[4764]: E0309 13:41:11.979409 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a338463-1443-4863-830e-0621abc3ed15" 
containerName="keystone-bootstrap" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.979476 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a338463-1443-4863-830e-0621abc3ed15" containerName="keystone-bootstrap" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.979729 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a338463-1443-4863-830e-0621abc3ed15" containerName="keystone-bootstrap" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.980704 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.983501 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.984918 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.985217 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.985413 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.985600 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4hfql" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.985766 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.993409 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-759c9c64fb-nwls6"] Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.089193 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-scripts\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.089280 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-fernet-keys\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.089304 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-combined-ca-bundle\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.089335 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-internal-tls-certs\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.089354 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-credential-keys\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.089384 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcw5v\" 
(UniqueName: \"kubernetes.io/projected/48b871c4-f2e8-44e9-9268-54920414c084-kube-api-access-xcw5v\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.089410 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-config-data\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.089451 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-public-tls-certs\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.191041 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-internal-tls-certs\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.191116 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-credential-keys\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.191144 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcw5v\" (UniqueName: 
\"kubernetes.io/projected/48b871c4-f2e8-44e9-9268-54920414c084-kube-api-access-xcw5v\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.191174 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-config-data\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.191223 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-public-tls-certs\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.191279 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-scripts\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.191322 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-fernet-keys\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.191343 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-combined-ca-bundle\") pod 
\"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.203915 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-combined-ca-bundle\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.204782 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-credential-keys\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.212811 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-internal-tls-certs\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.213298 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-fernet-keys\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.213372 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-config-data\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 
13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.213850 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-scripts\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.223625 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-public-tls-certs\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.240298 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcw5v\" (UniqueName: \"kubernetes.io/projected/48b871c4-f2e8-44e9-9268-54920414c084-kube-api-access-xcw5v\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.308427 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.433071 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-f85c59cb-gm4df"] Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.436414 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.458603 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f85c59cb-gm4df"] Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.601610 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a26a533-a42a-4553-96b3-922ad860ca7a-public-tls-certs\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.601781 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a26a533-a42a-4553-96b3-922ad860ca7a-config-data\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.601817 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvp6f\" (UniqueName: \"kubernetes.io/projected/2a26a533-a42a-4553-96b3-922ad860ca7a-kube-api-access-vvp6f\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.601845 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a26a533-a42a-4553-96b3-922ad860ca7a-logs\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.601905 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/2a26a533-a42a-4553-96b3-922ad860ca7a-combined-ca-bundle\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.601964 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a26a533-a42a-4553-96b3-922ad860ca7a-internal-tls-certs\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.601996 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a26a533-a42a-4553-96b3-922ad860ca7a-scripts\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.706130 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a26a533-a42a-4553-96b3-922ad860ca7a-config-data\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.706190 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvp6f\" (UniqueName: \"kubernetes.io/projected/2a26a533-a42a-4553-96b3-922ad860ca7a-kube-api-access-vvp6f\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.706229 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2a26a533-a42a-4553-96b3-922ad860ca7a-logs\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.706273 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a26a533-a42a-4553-96b3-922ad860ca7a-combined-ca-bundle\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.706876 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a26a533-a42a-4553-96b3-922ad860ca7a-internal-tls-certs\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.707041 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a26a533-a42a-4553-96b3-922ad860ca7a-scripts\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.707108 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a26a533-a42a-4553-96b3-922ad860ca7a-public-tls-certs\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.708391 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a26a533-a42a-4553-96b3-922ad860ca7a-logs\") pod \"placement-f85c59cb-gm4df\" (UID: 
\"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.711638 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a26a533-a42a-4553-96b3-922ad860ca7a-internal-tls-certs\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.713137 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a26a533-a42a-4553-96b3-922ad860ca7a-scripts\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.713342 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a26a533-a42a-4553-96b3-922ad860ca7a-config-data\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.721425 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a26a533-a42a-4553-96b3-922ad860ca7a-combined-ca-bundle\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.725265 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvp6f\" (UniqueName: \"kubernetes.io/projected/2a26a533-a42a-4553-96b3-922ad860ca7a-kube-api-access-vvp6f\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 
13:41:12.725315 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a26a533-a42a-4553-96b3-922ad860ca7a-public-tls-certs\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.765812 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:13 crc kubenswrapper[4764]: I0309 13:41:13.773636 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-586d68b4fd-xj4tk"] Mar 09 13:41:13 crc kubenswrapper[4764]: W0309 13:41:13.783106 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode89507a7_04a1_444b_b38a_40b001ec079a.slice/crio-521edccf936d07531fd9f618513cede620e92cebf6edb91c3eb8e6d1d0940b40 WatchSource:0}: Error finding container 521edccf936d07531fd9f618513cede620e92cebf6edb91c3eb8e6d1d0940b40: Status 404 returned error can't find the container with id 521edccf936d07531fd9f618513cede620e92cebf6edb91c3eb8e6d1d0940b40 Mar 09 13:41:13 crc kubenswrapper[4764]: I0309 13:41:13.872384 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f85c59cb-gm4df"] Mar 09 13:41:13 crc kubenswrapper[4764]: I0309 13:41:13.937877 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"861cdd7d-b563-4009-9c33-a5c64d6ffae9","Type":"ContainerStarted","Data":"d869817a82b9f339ae214ccda8eaf5aa4fa4f2796d726fb166b8fa1a3f7e2eff"} Mar 09 13:41:13 crc kubenswrapper[4764]: I0309 13:41:13.939552 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-586d68b4fd-xj4tk" event={"ID":"e89507a7-04a1-444b-b38a-40b001ec079a","Type":"ContainerStarted","Data":"521edccf936d07531fd9f618513cede620e92cebf6edb91c3eb8e6d1d0940b40"} Mar 09 13:41:13 
crc kubenswrapper[4764]: I0309 13:41:13.940701 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f85c59cb-gm4df" event={"ID":"2a26a533-a42a-4553-96b3-922ad860ca7a","Type":"ContainerStarted","Data":"dc2f6cf195a7a4d81bbe22d2d05f6dab5cd71f4bbb5aa5f4f57465a6ba1dbaf0"} Mar 09 13:41:13 crc kubenswrapper[4764]: I0309 13:41:13.942696 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c97985d69-khcvg" event={"ID":"5d65fa53-be02-4d11-b300-5cb4629c03da","Type":"ContainerStarted","Data":"cff13fa571b5cb0d7d8cf860c4a1efdb55518757df6a7508bbd7dca053bc0e44"} Mar 09 13:41:13 crc kubenswrapper[4764]: I0309 13:41:13.945094 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6c97985d69-khcvg" Mar 09 13:41:13 crc kubenswrapper[4764]: I0309 13:41:13.970992 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-759c9c64fb-nwls6"] Mar 09 13:41:14 crc kubenswrapper[4764]: I0309 13:41:14.002587 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6c97985d69-khcvg" podStartSLOduration=7.002559994 podStartE2EDuration="7.002559994s" podCreationTimestamp="2026-03-09 13:41:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:41:13.990290006 +0000 UTC m=+1229.240461914" watchObservedRunningTime="2026-03-09 13:41:14.002559994 +0000 UTC m=+1229.252731902" Mar 09 13:41:14 crc kubenswrapper[4764]: I0309 13:41:14.952052 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f85c59cb-gm4df" event={"ID":"2a26a533-a42a-4553-96b3-922ad860ca7a","Type":"ContainerStarted","Data":"0428883faa796852458c72233641e433d39f3470e2adf94bdc2955214fca65d0"} Mar 09 13:41:14 crc kubenswrapper[4764]: I0309 13:41:14.952407 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-f85c59cb-gm4df" Mar 09 
13:41:14 crc kubenswrapper[4764]: I0309 13:41:14.952428 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:14 crc kubenswrapper[4764]: I0309 13:41:14.952442 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f85c59cb-gm4df" event={"ID":"2a26a533-a42a-4553-96b3-922ad860ca7a","Type":"ContainerStarted","Data":"1a183d4acff4926d12823fed1357a77c6cf0ac1cd4db68a8320ca592acf90a8d"} Mar 09 13:41:14 crc kubenswrapper[4764]: I0309 13:41:14.955042 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-759c9c64fb-nwls6" event={"ID":"48b871c4-f2e8-44e9-9268-54920414c084","Type":"ContainerStarted","Data":"2f8fad3993132a09c0f6aa7d6a907699fbfb2983fa82f5bdb6eabaff82cf0739"} Mar 09 13:41:14 crc kubenswrapper[4764]: I0309 13:41:14.955120 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:14 crc kubenswrapper[4764]: I0309 13:41:14.955131 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-759c9c64fb-nwls6" event={"ID":"48b871c4-f2e8-44e9-9268-54920414c084","Type":"ContainerStarted","Data":"a621183f66c1c30c816f1b516021cdd36424bc24a400ea319637f857bdd39514"} Mar 09 13:41:14 crc kubenswrapper[4764]: I0309 13:41:14.957546 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-586d68b4fd-xj4tk" event={"ID":"e89507a7-04a1-444b-b38a-40b001ec079a","Type":"ContainerStarted","Data":"6939747e3ae2dc764ad9ad287190d81cffed4e3b8fef6031a7e72a9769a88873"} Mar 09 13:41:14 crc kubenswrapper[4764]: I0309 13:41:14.957578 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-586d68b4fd-xj4tk" event={"ID":"e89507a7-04a1-444b-b38a-40b001ec079a","Type":"ContainerStarted","Data":"846d1c4f6ed0185cda05e320aba446c1f9b2558f2a5da71ef147be109e27b726"} Mar 09 13:41:14 crc kubenswrapper[4764]: I0309 13:41:14.987312 4764 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-f85c59cb-gm4df" podStartSLOduration=2.987289876 podStartE2EDuration="2.987289876s" podCreationTimestamp="2026-03-09 13:41:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:41:14.973750827 +0000 UTC m=+1230.223922745" watchObservedRunningTime="2026-03-09 13:41:14.987289876 +0000 UTC m=+1230.237461784" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.001821 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-759c9c64fb-nwls6" podStartSLOduration=4.00178865 podStartE2EDuration="4.00178865s" podCreationTimestamp="2026-03-09 13:41:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:41:14.994467086 +0000 UTC m=+1230.244639004" watchObservedRunningTime="2026-03-09 13:41:15.00178865 +0000 UTC m=+1230.251960568" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.025045 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-586d68b4fd-xj4tk" podStartSLOduration=6.025023392 podStartE2EDuration="6.025023392s" podCreationTimestamp="2026-03-09 13:41:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:41:15.0165684 +0000 UTC m=+1230.266740308" watchObservedRunningTime="2026-03-09 13:41:15.025023392 +0000 UTC m=+1230.275195320" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.185957 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.251994 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-47q58"] Mar 09 13:41:15 crc 
kubenswrapper[4764]: I0309 13:41:15.252292 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" podUID="f99ecd3e-1d5b-4f3d-80d7-9a401171fef1" containerName="dnsmasq-dns" containerID="cri-o://1c12c99553c3fded7ba52a026664e1bff10eba1ac57dc2b2484eb57141804c22" gracePeriod=10 Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.794073 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.865095 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-ovsdbserver-sb\") pod \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.866038 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-ovsdbserver-nb\") pod \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.866307 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-config\") pod \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.866467 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-dns-svc\") pod \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.866493 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4km7\" (UniqueName: \"kubernetes.io/projected/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-kube-api-access-p4km7\") pod \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.875062 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-kube-api-access-p4km7" (OuterVolumeSpecName: "kube-api-access-p4km7") pod "f99ecd3e-1d5b-4f3d-80d7-9a401171fef1" (UID: "f99ecd3e-1d5b-4f3d-80d7-9a401171fef1"). InnerVolumeSpecName "kube-api-access-p4km7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.919041 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f99ecd3e-1d5b-4f3d-80d7-9a401171fef1" (UID: "f99ecd3e-1d5b-4f3d-80d7-9a401171fef1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.928177 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-config" (OuterVolumeSpecName: "config") pod "f99ecd3e-1d5b-4f3d-80d7-9a401171fef1" (UID: "f99ecd3e-1d5b-4f3d-80d7-9a401171fef1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.947622 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f99ecd3e-1d5b-4f3d-80d7-9a401171fef1" (UID: "f99ecd3e-1d5b-4f3d-80d7-9a401171fef1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.967628 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f99ecd3e-1d5b-4f3d-80d7-9a401171fef1" (UID: "f99ecd3e-1d5b-4f3d-80d7-9a401171fef1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.968390 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-ovsdbserver-nb\") pod \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " Mar 09 13:41:15 crc kubenswrapper[4764]: W0309 13:41:15.969170 4764 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1/volumes/kubernetes.io~configmap/ovsdbserver-nb Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.969275 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f99ecd3e-1d5b-4f3d-80d7-9a401171fef1" (UID: "f99ecd3e-1d5b-4f3d-80d7-9a401171fef1"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.971497 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.971695 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.971718 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4km7\" (UniqueName: \"kubernetes.io/projected/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-kube-api-access-p4km7\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.971733 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.972084 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.972851 4764 generic.go:334] "Generic (PLEG): container finished" podID="f99ecd3e-1d5b-4f3d-80d7-9a401171fef1" containerID="1c12c99553c3fded7ba52a026664e1bff10eba1ac57dc2b2484eb57141804c22" exitCode=0 Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.973804 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" event={"ID":"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1","Type":"ContainerDied","Data":"1c12c99553c3fded7ba52a026664e1bff10eba1ac57dc2b2484eb57141804c22"} Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 
13:41:15.973854 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.974092 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" event={"ID":"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1","Type":"ContainerDied","Data":"29094e7809129e1e7698bb9912d54d76c17888d5bfac065889c5bd5838e4b71c"} Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.974124 4764 scope.go:117] "RemoveContainer" containerID="1c12c99553c3fded7ba52a026664e1bff10eba1ac57dc2b2484eb57141804c22" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.975076 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.975770 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:16 crc kubenswrapper[4764]: I0309 13:41:16.022921 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-47q58"] Mar 09 13:41:16 crc kubenswrapper[4764]: I0309 13:41:16.027728 4764 scope.go:117] "RemoveContainer" containerID="643b445a5cdf8e66cc772b07777e2b70383e28ddead01aaa535ab1ef9fb46333" Mar 09 13:41:16 crc kubenswrapper[4764]: I0309 13:41:16.029932 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-47q58"] Mar 09 13:41:16 crc kubenswrapper[4764]: I0309 13:41:16.059314 4764 scope.go:117] "RemoveContainer" containerID="1c12c99553c3fded7ba52a026664e1bff10eba1ac57dc2b2484eb57141804c22" Mar 09 13:41:16 crc kubenswrapper[4764]: E0309 13:41:16.060106 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c12c99553c3fded7ba52a026664e1bff10eba1ac57dc2b2484eb57141804c22\": container with ID starting with 
1c12c99553c3fded7ba52a026664e1bff10eba1ac57dc2b2484eb57141804c22 not found: ID does not exist" containerID="1c12c99553c3fded7ba52a026664e1bff10eba1ac57dc2b2484eb57141804c22" Mar 09 13:41:16 crc kubenswrapper[4764]: I0309 13:41:16.060165 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c12c99553c3fded7ba52a026664e1bff10eba1ac57dc2b2484eb57141804c22"} err="failed to get container status \"1c12c99553c3fded7ba52a026664e1bff10eba1ac57dc2b2484eb57141804c22\": rpc error: code = NotFound desc = could not find container \"1c12c99553c3fded7ba52a026664e1bff10eba1ac57dc2b2484eb57141804c22\": container with ID starting with 1c12c99553c3fded7ba52a026664e1bff10eba1ac57dc2b2484eb57141804c22 not found: ID does not exist" Mar 09 13:41:16 crc kubenswrapper[4764]: I0309 13:41:16.060188 4764 scope.go:117] "RemoveContainer" containerID="643b445a5cdf8e66cc772b07777e2b70383e28ddead01aaa535ab1ef9fb46333" Mar 09 13:41:16 crc kubenswrapper[4764]: E0309 13:41:16.060741 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"643b445a5cdf8e66cc772b07777e2b70383e28ddead01aaa535ab1ef9fb46333\": container with ID starting with 643b445a5cdf8e66cc772b07777e2b70383e28ddead01aaa535ab1ef9fb46333 not found: ID does not exist" containerID="643b445a5cdf8e66cc772b07777e2b70383e28ddead01aaa535ab1ef9fb46333" Mar 09 13:41:16 crc kubenswrapper[4764]: I0309 13:41:16.060768 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"643b445a5cdf8e66cc772b07777e2b70383e28ddead01aaa535ab1ef9fb46333"} err="failed to get container status \"643b445a5cdf8e66cc772b07777e2b70383e28ddead01aaa535ab1ef9fb46333\": rpc error: code = NotFound desc = could not find container \"643b445a5cdf8e66cc772b07777e2b70383e28ddead01aaa535ab1ef9fb46333\": container with ID starting with 643b445a5cdf8e66cc772b07777e2b70383e28ddead01aaa535ab1ef9fb46333 not found: ID does not 
exist" Mar 09 13:41:16 crc kubenswrapper[4764]: I0309 13:41:16.985696 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-x9gvc" event={"ID":"cb54f57d-afb6-4e53-be9a-4b22573a9450","Type":"ContainerStarted","Data":"e95470c676ddedadba89efafc3707fbce528908b0bfa879405e032128d81cc49"} Mar 09 13:41:17 crc kubenswrapper[4764]: I0309 13:41:17.007283 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-x9gvc" podStartSLOduration=3.2118654429999998 podStartE2EDuration="36.007266077s" podCreationTimestamp="2026-03-09 13:40:41 +0000 UTC" firstStartedPulling="2026-03-09 13:40:43.339839636 +0000 UTC m=+1198.590011544" lastFinishedPulling="2026-03-09 13:41:16.13524027 +0000 UTC m=+1231.385412178" observedRunningTime="2026-03-09 13:41:17.001731309 +0000 UTC m=+1232.251903217" watchObservedRunningTime="2026-03-09 13:41:17.007266077 +0000 UTC m=+1232.257437985" Mar 09 13:41:17 crc kubenswrapper[4764]: I0309 13:41:17.571175 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f99ecd3e-1d5b-4f3d-80d7-9a401171fef1" path="/var/lib/kubelet/pods/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1/volumes" Mar 09 13:41:19 crc kubenswrapper[4764]: I0309 13:41:19.012355 4764 generic.go:334] "Generic (PLEG): container finished" podID="cb54f57d-afb6-4e53-be9a-4b22573a9450" containerID="e95470c676ddedadba89efafc3707fbce528908b0bfa879405e032128d81cc49" exitCode=0 Mar 09 13:41:19 crc kubenswrapper[4764]: I0309 13:41:19.012525 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-x9gvc" event={"ID":"cb54f57d-afb6-4e53-be9a-4b22573a9450","Type":"ContainerDied","Data":"e95470c676ddedadba89efafc3707fbce528908b0bfa879405e032128d81cc49"} Mar 09 13:41:21 crc kubenswrapper[4764]: I0309 13:41:21.618220 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-x9gvc" Mar 09 13:41:21 crc kubenswrapper[4764]: I0309 13:41:21.689676 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jvrd\" (UniqueName: \"kubernetes.io/projected/cb54f57d-afb6-4e53-be9a-4b22573a9450-kube-api-access-7jvrd\") pod \"cb54f57d-afb6-4e53-be9a-4b22573a9450\" (UID: \"cb54f57d-afb6-4e53-be9a-4b22573a9450\") " Mar 09 13:41:21 crc kubenswrapper[4764]: I0309 13:41:21.689750 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cb54f57d-afb6-4e53-be9a-4b22573a9450-db-sync-config-data\") pod \"cb54f57d-afb6-4e53-be9a-4b22573a9450\" (UID: \"cb54f57d-afb6-4e53-be9a-4b22573a9450\") " Mar 09 13:41:21 crc kubenswrapper[4764]: I0309 13:41:21.689822 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb54f57d-afb6-4e53-be9a-4b22573a9450-combined-ca-bundle\") pod \"cb54f57d-afb6-4e53-be9a-4b22573a9450\" (UID: \"cb54f57d-afb6-4e53-be9a-4b22573a9450\") " Mar 09 13:41:21 crc kubenswrapper[4764]: I0309 13:41:21.701674 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb54f57d-afb6-4e53-be9a-4b22573a9450-kube-api-access-7jvrd" (OuterVolumeSpecName: "kube-api-access-7jvrd") pod "cb54f57d-afb6-4e53-be9a-4b22573a9450" (UID: "cb54f57d-afb6-4e53-be9a-4b22573a9450"). InnerVolumeSpecName "kube-api-access-7jvrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:41:21 crc kubenswrapper[4764]: I0309 13:41:21.703854 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb54f57d-afb6-4e53-be9a-4b22573a9450-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "cb54f57d-afb6-4e53-be9a-4b22573a9450" (UID: "cb54f57d-afb6-4e53-be9a-4b22573a9450"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:21 crc kubenswrapper[4764]: I0309 13:41:21.720389 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb54f57d-afb6-4e53-be9a-4b22573a9450-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb54f57d-afb6-4e53-be9a-4b22573a9450" (UID: "cb54f57d-afb6-4e53-be9a-4b22573a9450"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:21 crc kubenswrapper[4764]: I0309 13:41:21.793594 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jvrd\" (UniqueName: \"kubernetes.io/projected/cb54f57d-afb6-4e53-be9a-4b22573a9450-kube-api-access-7jvrd\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:21 crc kubenswrapper[4764]: I0309 13:41:21.793635 4764 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cb54f57d-afb6-4e53-be9a-4b22573a9450-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:21 crc kubenswrapper[4764]: I0309 13:41:21.793656 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb54f57d-afb6-4e53-be9a-4b22573a9450-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.060715 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-x9gvc" Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.060714 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-x9gvc" event={"ID":"cb54f57d-afb6-4e53-be9a-4b22573a9450","Type":"ContainerDied","Data":"7611ab820e5631fb861e6ee00f8e6a6553e8f141cb68d0a4556a1a0beeb23d3b"} Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.060853 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7611ab820e5631fb861e6ee00f8e6a6553e8f141cb68d0a4556a1a0beeb23d3b" Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.872693 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-789c56cf69-2dj2c"] Mar 09 13:41:22 crc kubenswrapper[4764]: E0309 13:41:22.873900 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f99ecd3e-1d5b-4f3d-80d7-9a401171fef1" containerName="dnsmasq-dns" Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.873915 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f99ecd3e-1d5b-4f3d-80d7-9a401171fef1" containerName="dnsmasq-dns" Mar 09 13:41:22 crc kubenswrapper[4764]: E0309 13:41:22.873936 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f99ecd3e-1d5b-4f3d-80d7-9a401171fef1" containerName="init" Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.873942 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f99ecd3e-1d5b-4f3d-80d7-9a401171fef1" containerName="init" Mar 09 13:41:22 crc kubenswrapper[4764]: E0309 13:41:22.873959 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb54f57d-afb6-4e53-be9a-4b22573a9450" containerName="barbican-db-sync" Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.873966 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb54f57d-afb6-4e53-be9a-4b22573a9450" containerName="barbican-db-sync" Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.874152 4764 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f99ecd3e-1d5b-4f3d-80d7-9a401171fef1" containerName="dnsmasq-dns" Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.874169 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb54f57d-afb6-4e53-be9a-4b22573a9450" containerName="barbican-db-sync" Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.880163 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-789c56cf69-2dj2c" Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.882320 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.882666 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.885456 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-j8zg7" Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.901404 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7c6f54974-hws5g"] Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.903087 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7c6f54974-hws5g" Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.906050 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.918763 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-789c56cf69-2dj2c"] Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.948365 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7c6f54974-hws5g"] Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.018011 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a18071d3-1164-4080-9095-919bb5349bb8-combined-ca-bundle\") pod \"barbican-worker-789c56cf69-2dj2c\" (UID: \"a18071d3-1164-4080-9095-919bb5349bb8\") " pod="openstack/barbican-worker-789c56cf69-2dj2c" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.018070 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf2cd\" (UniqueName: \"kubernetes.io/projected/154490f8-97ab-4703-a96c-16b6d5f7a178-kube-api-access-wf2cd\") pod \"barbican-keystone-listener-7c6f54974-hws5g\" (UID: \"154490f8-97ab-4703-a96c-16b6d5f7a178\") " pod="openstack/barbican-keystone-listener-7c6f54974-hws5g" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.018106 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a18071d3-1164-4080-9095-919bb5349bb8-config-data\") pod \"barbican-worker-789c56cf69-2dj2c\" (UID: \"a18071d3-1164-4080-9095-919bb5349bb8\") " pod="openstack/barbican-worker-789c56cf69-2dj2c" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.018144 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a18071d3-1164-4080-9095-919bb5349bb8-config-data-custom\") pod \"barbican-worker-789c56cf69-2dj2c\" (UID: \"a18071d3-1164-4080-9095-919bb5349bb8\") " pod="openstack/barbican-worker-789c56cf69-2dj2c" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.018170 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/154490f8-97ab-4703-a96c-16b6d5f7a178-config-data-custom\") pod \"barbican-keystone-listener-7c6f54974-hws5g\" (UID: \"154490f8-97ab-4703-a96c-16b6d5f7a178\") " pod="openstack/barbican-keystone-listener-7c6f54974-hws5g" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.018205 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/154490f8-97ab-4703-a96c-16b6d5f7a178-logs\") pod \"barbican-keystone-listener-7c6f54974-hws5g\" (UID: \"154490f8-97ab-4703-a96c-16b6d5f7a178\") " pod="openstack/barbican-keystone-listener-7c6f54974-hws5g" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.018254 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/154490f8-97ab-4703-a96c-16b6d5f7a178-config-data\") pod \"barbican-keystone-listener-7c6f54974-hws5g\" (UID: \"154490f8-97ab-4703-a96c-16b6d5f7a178\") " pod="openstack/barbican-keystone-listener-7c6f54974-hws5g" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.018276 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzfmq\" (UniqueName: \"kubernetes.io/projected/a18071d3-1164-4080-9095-919bb5349bb8-kube-api-access-zzfmq\") pod \"barbican-worker-789c56cf69-2dj2c\" (UID: \"a18071d3-1164-4080-9095-919bb5349bb8\") " 
pod="openstack/barbican-worker-789c56cf69-2dj2c" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.018317 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154490f8-97ab-4703-a96c-16b6d5f7a178-combined-ca-bundle\") pod \"barbican-keystone-listener-7c6f54974-hws5g\" (UID: \"154490f8-97ab-4703-a96c-16b6d5f7a178\") " pod="openstack/barbican-keystone-listener-7c6f54974-hws5g" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.018344 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a18071d3-1164-4080-9095-919bb5349bb8-logs\") pod \"barbican-worker-789c56cf69-2dj2c\" (UID: \"a18071d3-1164-4080-9095-919bb5349bb8\") " pod="openstack/barbican-worker-789c56cf69-2dj2c" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.022880 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-869f779d85-qsc97"] Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.024572 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-869f779d85-qsc97" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.054461 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-qsc97"] Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.120410 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a18071d3-1164-4080-9095-919bb5349bb8-combined-ca-bundle\") pod \"barbican-worker-789c56cf69-2dj2c\" (UID: \"a18071d3-1164-4080-9095-919bb5349bb8\") " pod="openstack/barbican-worker-789c56cf69-2dj2c" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.120457 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf2cd\" (UniqueName: \"kubernetes.io/projected/154490f8-97ab-4703-a96c-16b6d5f7a178-kube-api-access-wf2cd\") pod \"barbican-keystone-listener-7c6f54974-hws5g\" (UID: \"154490f8-97ab-4703-a96c-16b6d5f7a178\") " pod="openstack/barbican-keystone-listener-7c6f54974-hws5g" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.120516 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a18071d3-1164-4080-9095-919bb5349bb8-config-data\") pod \"barbican-worker-789c56cf69-2dj2c\" (UID: \"a18071d3-1164-4080-9095-919bb5349bb8\") " pod="openstack/barbican-worker-789c56cf69-2dj2c" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.120571 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-ovsdbserver-nb\") pod \"dnsmasq-dns-869f779d85-qsc97\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " pod="openstack/dnsmasq-dns-869f779d85-qsc97" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.120685 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a18071d3-1164-4080-9095-919bb5349bb8-config-data-custom\") pod \"barbican-worker-789c56cf69-2dj2c\" (UID: \"a18071d3-1164-4080-9095-919bb5349bb8\") " pod="openstack/barbican-worker-789c56cf69-2dj2c" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.120715 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-dns-svc\") pod \"dnsmasq-dns-869f779d85-qsc97\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " pod="openstack/dnsmasq-dns-869f779d85-qsc97" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.120913 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/154490f8-97ab-4703-a96c-16b6d5f7a178-config-data-custom\") pod \"barbican-keystone-listener-7c6f54974-hws5g\" (UID: \"154490f8-97ab-4703-a96c-16b6d5f7a178\") " pod="openstack/barbican-keystone-listener-7c6f54974-hws5g" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.120962 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/154490f8-97ab-4703-a96c-16b6d5f7a178-logs\") pod \"barbican-keystone-listener-7c6f54974-hws5g\" (UID: \"154490f8-97ab-4703-a96c-16b6d5f7a178\") " pod="openstack/barbican-keystone-listener-7c6f54974-hws5g" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.121070 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-config\") pod \"dnsmasq-dns-869f779d85-qsc97\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " pod="openstack/dnsmasq-dns-869f779d85-qsc97" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.121669 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-489r4\" (UniqueName: \"kubernetes.io/projected/c38cb253-b945-43b3-8dcd-209682d40f11-kube-api-access-489r4\") pod \"dnsmasq-dns-869f779d85-qsc97\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " pod="openstack/dnsmasq-dns-869f779d85-qsc97" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.121711 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/154490f8-97ab-4703-a96c-16b6d5f7a178-config-data\") pod \"barbican-keystone-listener-7c6f54974-hws5g\" (UID: \"154490f8-97ab-4703-a96c-16b6d5f7a178\") " pod="openstack/barbican-keystone-listener-7c6f54974-hws5g" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.121736 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzfmq\" (UniqueName: \"kubernetes.io/projected/a18071d3-1164-4080-9095-919bb5349bb8-kube-api-access-zzfmq\") pod \"barbican-worker-789c56cf69-2dj2c\" (UID: \"a18071d3-1164-4080-9095-919bb5349bb8\") " pod="openstack/barbican-worker-789c56cf69-2dj2c" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.121823 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154490f8-97ab-4703-a96c-16b6d5f7a178-combined-ca-bundle\") pod \"barbican-keystone-listener-7c6f54974-hws5g\" (UID: \"154490f8-97ab-4703-a96c-16b6d5f7a178\") " pod="openstack/barbican-keystone-listener-7c6f54974-hws5g" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.121858 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a18071d3-1164-4080-9095-919bb5349bb8-logs\") pod \"barbican-worker-789c56cf69-2dj2c\" (UID: \"a18071d3-1164-4080-9095-919bb5349bb8\") " pod="openstack/barbican-worker-789c56cf69-2dj2c" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 
13:41:23.121878 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-ovsdbserver-sb\") pod \"dnsmasq-dns-869f779d85-qsc97\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " pod="openstack/dnsmasq-dns-869f779d85-qsc97" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.122419 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/154490f8-97ab-4703-a96c-16b6d5f7a178-logs\") pod \"barbican-keystone-listener-7c6f54974-hws5g\" (UID: \"154490f8-97ab-4703-a96c-16b6d5f7a178\") " pod="openstack/barbican-keystone-listener-7c6f54974-hws5g" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.124662 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dp5x6" event={"ID":"74146b7d-9780-4d2d-9454-853296f88955","Type":"ContainerStarted","Data":"1a80c7f5de2d815f10d1b147b52b1c923bf4e3266278afd841a98b5bc66a8ad6"} Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.126544 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a18071d3-1164-4080-9095-919bb5349bb8-logs\") pod \"barbican-worker-789c56cf69-2dj2c\" (UID: \"a18071d3-1164-4080-9095-919bb5349bb8\") " pod="openstack/barbican-worker-789c56cf69-2dj2c" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.135546 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a18071d3-1164-4080-9095-919bb5349bb8-config-data-custom\") pod \"barbican-worker-789c56cf69-2dj2c\" (UID: \"a18071d3-1164-4080-9095-919bb5349bb8\") " pod="openstack/barbican-worker-789c56cf69-2dj2c" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.136286 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a18071d3-1164-4080-9095-919bb5349bb8-combined-ca-bundle\") pod \"barbican-worker-789c56cf69-2dj2c\" (UID: \"a18071d3-1164-4080-9095-919bb5349bb8\") " pod="openstack/barbican-worker-789c56cf69-2dj2c" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.145716 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzfmq\" (UniqueName: \"kubernetes.io/projected/a18071d3-1164-4080-9095-919bb5349bb8-kube-api-access-zzfmq\") pod \"barbican-worker-789c56cf69-2dj2c\" (UID: \"a18071d3-1164-4080-9095-919bb5349bb8\") " pod="openstack/barbican-worker-789c56cf69-2dj2c" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.155981 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154490f8-97ab-4703-a96c-16b6d5f7a178-combined-ca-bundle\") pod \"barbican-keystone-listener-7c6f54974-hws5g\" (UID: \"154490f8-97ab-4703-a96c-16b6d5f7a178\") " pod="openstack/barbican-keystone-listener-7c6f54974-hws5g" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.159636 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/154490f8-97ab-4703-a96c-16b6d5f7a178-config-data-custom\") pod \"barbican-keystone-listener-7c6f54974-hws5g\" (UID: \"154490f8-97ab-4703-a96c-16b6d5f7a178\") " pod="openstack/barbican-keystone-listener-7c6f54974-hws5g" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.160748 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a18071d3-1164-4080-9095-919bb5349bb8-config-data\") pod \"barbican-worker-789c56cf69-2dj2c\" (UID: \"a18071d3-1164-4080-9095-919bb5349bb8\") " pod="openstack/barbican-worker-789c56cf69-2dj2c" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.161218 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf2cd\" 
(UniqueName: \"kubernetes.io/projected/154490f8-97ab-4703-a96c-16b6d5f7a178-kube-api-access-wf2cd\") pod \"barbican-keystone-listener-7c6f54974-hws5g\" (UID: \"154490f8-97ab-4703-a96c-16b6d5f7a178\") " pod="openstack/barbican-keystone-listener-7c6f54974-hws5g" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.168130 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"861cdd7d-b563-4009-9c33-a5c64d6ffae9","Type":"ContainerStarted","Data":"49c5c69c52f65c2c21bb505679de7692f04c3f794601e412d337315878f9e046"} Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.168671 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerName="ceilometer-central-agent" containerID="cri-o://2486ace2409cc48f14efc281a2eed3fa9e4956d145642ec1061f64d96ea18a5f" gracePeriod=30 Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.168922 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerName="ceilometer-notification-agent" containerID="cri-o://4db2afcb94ef0a26dd49d9961d5346346b289d4dbcccce08d916a8fe7f319ea2" gracePeriod=30 Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.168947 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.168964 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerName="sg-core" containerID="cri-o://d869817a82b9f339ae214ccda8eaf5aa4fa4f2796d726fb166b8fa1a3f7e2eff" gracePeriod=30 Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.169216 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" 
containerName="proxy-httpd" containerID="cri-o://49c5c69c52f65c2c21bb505679de7692f04c3f794601e412d337315878f9e046" gracePeriod=30 Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.171729 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/154490f8-97ab-4703-a96c-16b6d5f7a178-config-data\") pod \"barbican-keystone-listener-7c6f54974-hws5g\" (UID: \"154490f8-97ab-4703-a96c-16b6d5f7a178\") " pod="openstack/barbican-keystone-listener-7c6f54974-hws5g" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.175176 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-dp5x6" podStartSLOduration=3.418861012 podStartE2EDuration="42.175154395s" podCreationTimestamp="2026-03-09 13:40:41 +0000 UTC" firstStartedPulling="2026-03-09 13:40:43.324675096 +0000 UTC m=+1198.574847004" lastFinishedPulling="2026-03-09 13:41:22.080968479 +0000 UTC m=+1237.331140387" observedRunningTime="2026-03-09 13:41:23.15862092 +0000 UTC m=+1238.408792828" watchObservedRunningTime="2026-03-09 13:41:23.175154395 +0000 UTC m=+1238.425326323" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.212554 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-789c56cf69-2dj2c" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.234719 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7c6f54974-hws5g" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.242701 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-dns-svc\") pod \"dnsmasq-dns-869f779d85-qsc97\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " pod="openstack/dnsmasq-dns-869f779d85-qsc97" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.242901 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-config\") pod \"dnsmasq-dns-869f779d85-qsc97\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " pod="openstack/dnsmasq-dns-869f779d85-qsc97" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.242969 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-489r4\" (UniqueName: \"kubernetes.io/projected/c38cb253-b945-43b3-8dcd-209682d40f11-kube-api-access-489r4\") pod \"dnsmasq-dns-869f779d85-qsc97\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " pod="openstack/dnsmasq-dns-869f779d85-qsc97" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.243193 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-ovsdbserver-sb\") pod \"dnsmasq-dns-869f779d85-qsc97\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " pod="openstack/dnsmasq-dns-869f779d85-qsc97" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.246099 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-config\") pod \"dnsmasq-dns-869f779d85-qsc97\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " pod="openstack/dnsmasq-dns-869f779d85-qsc97" Mar 09 
13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.247101 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-ovsdbserver-sb\") pod \"dnsmasq-dns-869f779d85-qsc97\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " pod="openstack/dnsmasq-dns-869f779d85-qsc97" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.255331 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-ovsdbserver-nb\") pod \"dnsmasq-dns-869f779d85-qsc97\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " pod="openstack/dnsmasq-dns-869f779d85-qsc97" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.256384 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-ovsdbserver-nb\") pod \"dnsmasq-dns-869f779d85-qsc97\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " pod="openstack/dnsmasq-dns-869f779d85-qsc97" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.261049 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-dns-svc\") pod \"dnsmasq-dns-869f779d85-qsc97\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " pod="openstack/dnsmasq-dns-869f779d85-qsc97" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.302314 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.062579511 podStartE2EDuration="42.302287331s" podCreationTimestamp="2026-03-09 13:40:41 +0000 UTC" firstStartedPulling="2026-03-09 13:40:42.860709556 +0000 UTC m=+1198.110881464" lastFinishedPulling="2026-03-09 13:41:22.100417376 +0000 UTC m=+1237.350589284" observedRunningTime="2026-03-09 
13:41:23.218898061 +0000 UTC m=+1238.469069969" watchObservedRunningTime="2026-03-09 13:41:23.302287331 +0000 UTC m=+1238.552459249" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.302805 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-489r4\" (UniqueName: \"kubernetes.io/projected/c38cb253-b945-43b3-8dcd-209682d40f11-kube-api-access-489r4\") pod \"dnsmasq-dns-869f779d85-qsc97\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " pod="openstack/dnsmasq-dns-869f779d85-qsc97" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.323872 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7dcdc6487b-j8w75"] Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.326337 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7dcdc6487b-j8w75" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.332078 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.347538 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7dcdc6487b-j8w75"] Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.363844 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-869f779d85-qsc97" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.461315 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-config-data-custom\") pod \"barbican-api-7dcdc6487b-j8w75\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " pod="openstack/barbican-api-7dcdc6487b-j8w75" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.461917 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gstx5\" (UniqueName: \"kubernetes.io/projected/60a250ee-f349-4bab-b5b4-b402289210a6-kube-api-access-gstx5\") pod \"barbican-api-7dcdc6487b-j8w75\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " pod="openstack/barbican-api-7dcdc6487b-j8w75" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.461975 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60a250ee-f349-4bab-b5b4-b402289210a6-logs\") pod \"barbican-api-7dcdc6487b-j8w75\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " pod="openstack/barbican-api-7dcdc6487b-j8w75" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.462013 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-combined-ca-bundle\") pod \"barbican-api-7dcdc6487b-j8w75\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " pod="openstack/barbican-api-7dcdc6487b-j8w75" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.462043 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-config-data\") pod 
\"barbican-api-7dcdc6487b-j8w75\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " pod="openstack/barbican-api-7dcdc6487b-j8w75" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.567174 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gstx5\" (UniqueName: \"kubernetes.io/projected/60a250ee-f349-4bab-b5b4-b402289210a6-kube-api-access-gstx5\") pod \"barbican-api-7dcdc6487b-j8w75\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " pod="openstack/barbican-api-7dcdc6487b-j8w75" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.567312 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60a250ee-f349-4bab-b5b4-b402289210a6-logs\") pod \"barbican-api-7dcdc6487b-j8w75\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " pod="openstack/barbican-api-7dcdc6487b-j8w75" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.567352 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-combined-ca-bundle\") pod \"barbican-api-7dcdc6487b-j8w75\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " pod="openstack/barbican-api-7dcdc6487b-j8w75" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.567385 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-config-data\") pod \"barbican-api-7dcdc6487b-j8w75\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " pod="openstack/barbican-api-7dcdc6487b-j8w75" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.567434 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-config-data-custom\") pod \"barbican-api-7dcdc6487b-j8w75\" (UID: 
\"60a250ee-f349-4bab-b5b4-b402289210a6\") " pod="openstack/barbican-api-7dcdc6487b-j8w75" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.568002 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60a250ee-f349-4bab-b5b4-b402289210a6-logs\") pod \"barbican-api-7dcdc6487b-j8w75\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " pod="openstack/barbican-api-7dcdc6487b-j8w75" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.577481 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-config-data-custom\") pod \"barbican-api-7dcdc6487b-j8w75\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " pod="openstack/barbican-api-7dcdc6487b-j8w75" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.581313 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-combined-ca-bundle\") pod \"barbican-api-7dcdc6487b-j8w75\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " pod="openstack/barbican-api-7dcdc6487b-j8w75" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.584498 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-config-data\") pod \"barbican-api-7dcdc6487b-j8w75\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " pod="openstack/barbican-api-7dcdc6487b-j8w75" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.618101 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gstx5\" (UniqueName: \"kubernetes.io/projected/60a250ee-f349-4bab-b5b4-b402289210a6-kube-api-access-gstx5\") pod \"barbican-api-7dcdc6487b-j8w75\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " pod="openstack/barbican-api-7dcdc6487b-j8w75" 
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.684610 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7dcdc6487b-j8w75" Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.908350 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7c6f54974-hws5g"] Mar 09 13:41:23 crc kubenswrapper[4764]: W0309 13:41:23.916985 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod154490f8_97ab_4703_a96c_16b6d5f7a178.slice/crio-f9d6f5454e96631579c882ada452e520ff7962cd96029c1aa57007ee607c3b72 WatchSource:0}: Error finding container f9d6f5454e96631579c882ada452e520ff7962cd96029c1aa57007ee607c3b72: Status 404 returned error can't find the container with id f9d6f5454e96631579c882ada452e520ff7962cd96029c1aa57007ee607c3b72 Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.988962 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-789c56cf69-2dj2c"] Mar 09 13:41:23 crc kubenswrapper[4764]: W0309 13:41:23.995145 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda18071d3_1164_4080_9095_919bb5349bb8.slice/crio-6368fca578eaa6b79a350a7b8be6f5443ef3b3a49140f98edc5341fb7208ad48 WatchSource:0}: Error finding container 6368fca578eaa6b79a350a7b8be6f5443ef3b3a49140f98edc5341fb7208ad48: Status 404 returned error can't find the container with id 6368fca578eaa6b79a350a7b8be6f5443ef3b3a49140f98edc5341fb7208ad48 Mar 09 13:41:24 crc kubenswrapper[4764]: I0309 13:41:24.020420 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-qsc97"] Mar 09 13:41:24 crc kubenswrapper[4764]: W0309 13:41:24.023406 4764 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc38cb253_b945_43b3_8dcd_209682d40f11.slice/crio-744dd94a1b4869c7dc37123bb898cb66f67f114bb38f59e8f906fc2e985184ea WatchSource:0}: Error finding container 744dd94a1b4869c7dc37123bb898cb66f67f114bb38f59e8f906fc2e985184ea: Status 404 returned error can't find the container with id 744dd94a1b4869c7dc37123bb898cb66f67f114bb38f59e8f906fc2e985184ea Mar 09 13:41:24 crc kubenswrapper[4764]: I0309 13:41:24.179287 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869f779d85-qsc97" event={"ID":"c38cb253-b945-43b3-8dcd-209682d40f11","Type":"ContainerStarted","Data":"744dd94a1b4869c7dc37123bb898cb66f67f114bb38f59e8f906fc2e985184ea"} Mar 09 13:41:24 crc kubenswrapper[4764]: I0309 13:41:24.181721 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-789c56cf69-2dj2c" event={"ID":"a18071d3-1164-4080-9095-919bb5349bb8","Type":"ContainerStarted","Data":"6368fca578eaa6b79a350a7b8be6f5443ef3b3a49140f98edc5341fb7208ad48"} Mar 09 13:41:24 crc kubenswrapper[4764]: I0309 13:41:24.184565 4764 generic.go:334] "Generic (PLEG): container finished" podID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerID="49c5c69c52f65c2c21bb505679de7692f04c3f794601e412d337315878f9e046" exitCode=0 Mar 09 13:41:24 crc kubenswrapper[4764]: I0309 13:41:24.184588 4764 generic.go:334] "Generic (PLEG): container finished" podID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerID="d869817a82b9f339ae214ccda8eaf5aa4fa4f2796d726fb166b8fa1a3f7e2eff" exitCode=2 Mar 09 13:41:24 crc kubenswrapper[4764]: I0309 13:41:24.184598 4764 generic.go:334] "Generic (PLEG): container finished" podID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerID="2486ace2409cc48f14efc281a2eed3fa9e4956d145642ec1061f64d96ea18a5f" exitCode=0 Mar 09 13:41:24 crc kubenswrapper[4764]: I0309 13:41:24.184627 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"861cdd7d-b563-4009-9c33-a5c64d6ffae9","Type":"ContainerDied","Data":"49c5c69c52f65c2c21bb505679de7692f04c3f794601e412d337315878f9e046"} Mar 09 13:41:24 crc kubenswrapper[4764]: I0309 13:41:24.184663 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"861cdd7d-b563-4009-9c33-a5c64d6ffae9","Type":"ContainerDied","Data":"d869817a82b9f339ae214ccda8eaf5aa4fa4f2796d726fb166b8fa1a3f7e2eff"} Mar 09 13:41:24 crc kubenswrapper[4764]: I0309 13:41:24.184675 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"861cdd7d-b563-4009-9c33-a5c64d6ffae9","Type":"ContainerDied","Data":"2486ace2409cc48f14efc281a2eed3fa9e4956d145642ec1061f64d96ea18a5f"} Mar 09 13:41:24 crc kubenswrapper[4764]: I0309 13:41:24.185662 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c6f54974-hws5g" event={"ID":"154490f8-97ab-4703-a96c-16b6d5f7a178","Type":"ContainerStarted","Data":"f9d6f5454e96631579c882ada452e520ff7962cd96029c1aa57007ee607c3b72"} Mar 09 13:41:24 crc kubenswrapper[4764]: I0309 13:41:24.216868 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7dcdc6487b-j8w75"] Mar 09 13:41:24 crc kubenswrapper[4764]: W0309 13:41:24.222691 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60a250ee_f349_4bab_b5b4_b402289210a6.slice/crio-60454879be0f224ed5bb74217d2090bdcd6efc318e5799fb2111033ee5eacbe4 WatchSource:0}: Error finding container 60454879be0f224ed5bb74217d2090bdcd6efc318e5799fb2111033ee5eacbe4: Status 404 returned error can't find the container with id 60454879be0f224ed5bb74217d2090bdcd6efc318e5799fb2111033ee5eacbe4 Mar 09 13:41:25 crc kubenswrapper[4764]: I0309 13:41:25.213110 4764 generic.go:334] "Generic (PLEG): container finished" podID="c38cb253-b945-43b3-8dcd-209682d40f11" 
containerID="83e5b1aa14005ff710f52b631cae367c77a62e68a139a563a732645f67f44497" exitCode=0 Mar 09 13:41:25 crc kubenswrapper[4764]: I0309 13:41:25.213295 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869f779d85-qsc97" event={"ID":"c38cb253-b945-43b3-8dcd-209682d40f11","Type":"ContainerDied","Data":"83e5b1aa14005ff710f52b631cae367c77a62e68a139a563a732645f67f44497"} Mar 09 13:41:25 crc kubenswrapper[4764]: I0309 13:41:25.248869 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dcdc6487b-j8w75" event={"ID":"60a250ee-f349-4bab-b5b4-b402289210a6","Type":"ContainerStarted","Data":"73990548f38f06ad2795c4de799dea38305ca7065b97482d22fcdeed8a8051c5"} Mar 09 13:41:25 crc kubenswrapper[4764]: I0309 13:41:25.248946 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dcdc6487b-j8w75" event={"ID":"60a250ee-f349-4bab-b5b4-b402289210a6","Type":"ContainerStarted","Data":"4c95fd726c1b28b4f8ab8a6d5ebf1002f0c9dea4159ff74f568392de3a2ff067"} Mar 09 13:41:25 crc kubenswrapper[4764]: I0309 13:41:25.248962 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dcdc6487b-j8w75" event={"ID":"60a250ee-f349-4bab-b5b4-b402289210a6","Type":"ContainerStarted","Data":"60454879be0f224ed5bb74217d2090bdcd6efc318e5799fb2111033ee5eacbe4"} Mar 09 13:41:25 crc kubenswrapper[4764]: I0309 13:41:25.249779 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7dcdc6487b-j8w75" Mar 09 13:41:25 crc kubenswrapper[4764]: I0309 13:41:25.249839 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7dcdc6487b-j8w75" Mar 09 13:41:25 crc kubenswrapper[4764]: I0309 13:41:25.283495 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7dcdc6487b-j8w75" podStartSLOduration=2.283480259 podStartE2EDuration="2.283480259s" podCreationTimestamp="2026-03-09 13:41:23 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:41:25.281412227 +0000 UTC m=+1240.531584125" watchObservedRunningTime="2026-03-09 13:41:25.283480259 +0000 UTC m=+1240.533652177" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.169270 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.233764 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-config-data\") pod \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.233916 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mj2kg\" (UniqueName: \"kubernetes.io/projected/861cdd7d-b563-4009-9c33-a5c64d6ffae9-kube-api-access-mj2kg\") pod \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.233950 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-scripts\") pod \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.234063 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-combined-ca-bundle\") pod \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.234110 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/861cdd7d-b563-4009-9c33-a5c64d6ffae9-log-httpd\") pod \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.234166 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-sg-core-conf-yaml\") pod \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.234222 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/861cdd7d-b563-4009-9c33-a5c64d6ffae9-run-httpd\") pod \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.245265 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/861cdd7d-b563-4009-9c33-a5c64d6ffae9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "861cdd7d-b563-4009-9c33-a5c64d6ffae9" (UID: "861cdd7d-b563-4009-9c33-a5c64d6ffae9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.261707 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/861cdd7d-b563-4009-9c33-a5c64d6ffae9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "861cdd7d-b563-4009-9c33-a5c64d6ffae9" (UID: "861cdd7d-b563-4009-9c33-a5c64d6ffae9"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.273826 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/861cdd7d-b563-4009-9c33-a5c64d6ffae9-kube-api-access-mj2kg" (OuterVolumeSpecName: "kube-api-access-mj2kg") pod "861cdd7d-b563-4009-9c33-a5c64d6ffae9" (UID: "861cdd7d-b563-4009-9c33-a5c64d6ffae9"). InnerVolumeSpecName "kube-api-access-mj2kg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.324832 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-scripts" (OuterVolumeSpecName: "scripts") pod "861cdd7d-b563-4009-9c33-a5c64d6ffae9" (UID: "861cdd7d-b563-4009-9c33-a5c64d6ffae9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.338028 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mj2kg\" (UniqueName: \"kubernetes.io/projected/861cdd7d-b563-4009-9c33-a5c64d6ffae9-kube-api-access-mj2kg\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.338054 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.338068 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/861cdd7d-b563-4009-9c33-a5c64d6ffae9-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.338078 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/861cdd7d-b563-4009-9c33-a5c64d6ffae9-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:26 
crc kubenswrapper[4764]: I0309 13:41:26.368988 4764 generic.go:334] "Generic (PLEG): container finished" podID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerID="4db2afcb94ef0a26dd49d9961d5346346b289d4dbcccce08d916a8fe7f319ea2" exitCode=0 Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.369941 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.370864 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"861cdd7d-b563-4009-9c33-a5c64d6ffae9","Type":"ContainerDied","Data":"4db2afcb94ef0a26dd49d9961d5346346b289d4dbcccce08d916a8fe7f319ea2"} Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.370893 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"861cdd7d-b563-4009-9c33-a5c64d6ffae9","Type":"ContainerDied","Data":"6eab7083503ac11b2f955fc7b67907a3eb60734c7262ad63357465f6782429c0"} Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.370914 4764 scope.go:117] "RemoveContainer" containerID="49c5c69c52f65c2c21bb505679de7692f04c3f794601e412d337315878f9e046" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.426505 4764 scope.go:117] "RemoveContainer" containerID="d869817a82b9f339ae214ccda8eaf5aa4fa4f2796d726fb166b8fa1a3f7e2eff" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.432737 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "861cdd7d-b563-4009-9c33-a5c64d6ffae9" (UID: "861cdd7d-b563-4009-9c33-a5c64d6ffae9"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.440894 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.483074 4764 scope.go:117] "RemoveContainer" containerID="4db2afcb94ef0a26dd49d9961d5346346b289d4dbcccce08d916a8fe7f319ea2" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.509154 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-94887676d-fp9dl"] Mar 09 13:41:26 crc kubenswrapper[4764]: E0309 13:41:26.509680 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerName="ceilometer-central-agent" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.510146 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerName="ceilometer-central-agent" Mar 09 13:41:26 crc kubenswrapper[4764]: E0309 13:41:26.510176 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerName="ceilometer-notification-agent" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.510186 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerName="ceilometer-notification-agent" Mar 09 13:41:26 crc kubenswrapper[4764]: E0309 13:41:26.510200 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerName="sg-core" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.510209 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerName="sg-core" Mar 09 13:41:26 crc kubenswrapper[4764]: E0309 13:41:26.510230 4764 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerName="proxy-httpd" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.510239 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerName="proxy-httpd" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.510494 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerName="ceilometer-central-agent" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.510524 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerName="sg-core" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.510536 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerName="proxy-httpd" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.510550 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerName="ceilometer-notification-agent" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.515821 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.521374 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.521660 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.536743 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-94887676d-fp9dl"] Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.537691 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-config-data" (OuterVolumeSpecName: "config-data") pod "861cdd7d-b563-4009-9c33-a5c64d6ffae9" (UID: "861cdd7d-b563-4009-9c33-a5c64d6ffae9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.548022 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.578528 4764 scope.go:117] "RemoveContainer" containerID="2486ace2409cc48f14efc281a2eed3fa9e4956d145642ec1061f64d96ea18a5f" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.581568 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "861cdd7d-b563-4009-9c33-a5c64d6ffae9" (UID: "861cdd7d-b563-4009-9c33-a5c64d6ffae9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.612820 4764 scope.go:117] "RemoveContainer" containerID="49c5c69c52f65c2c21bb505679de7692f04c3f794601e412d337315878f9e046" Mar 09 13:41:26 crc kubenswrapper[4764]: E0309 13:41:26.613202 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49c5c69c52f65c2c21bb505679de7692f04c3f794601e412d337315878f9e046\": container with ID starting with 49c5c69c52f65c2c21bb505679de7692f04c3f794601e412d337315878f9e046 not found: ID does not exist" containerID="49c5c69c52f65c2c21bb505679de7692f04c3f794601e412d337315878f9e046" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.613237 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49c5c69c52f65c2c21bb505679de7692f04c3f794601e412d337315878f9e046"} err="failed to get container status \"49c5c69c52f65c2c21bb505679de7692f04c3f794601e412d337315878f9e046\": rpc error: code = NotFound desc = could not find container \"49c5c69c52f65c2c21bb505679de7692f04c3f794601e412d337315878f9e046\": container with ID starting with 49c5c69c52f65c2c21bb505679de7692f04c3f794601e412d337315878f9e046 not found: ID does not exist" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.613270 4764 scope.go:117] "RemoveContainer" containerID="d869817a82b9f339ae214ccda8eaf5aa4fa4f2796d726fb166b8fa1a3f7e2eff" Mar 09 13:41:26 crc kubenswrapper[4764]: E0309 13:41:26.613597 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d869817a82b9f339ae214ccda8eaf5aa4fa4f2796d726fb166b8fa1a3f7e2eff\": container with ID starting with d869817a82b9f339ae214ccda8eaf5aa4fa4f2796d726fb166b8fa1a3f7e2eff not found: ID does not exist" containerID="d869817a82b9f339ae214ccda8eaf5aa4fa4f2796d726fb166b8fa1a3f7e2eff" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.613663 
4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d869817a82b9f339ae214ccda8eaf5aa4fa4f2796d726fb166b8fa1a3f7e2eff"} err="failed to get container status \"d869817a82b9f339ae214ccda8eaf5aa4fa4f2796d726fb166b8fa1a3f7e2eff\": rpc error: code = NotFound desc = could not find container \"d869817a82b9f339ae214ccda8eaf5aa4fa4f2796d726fb166b8fa1a3f7e2eff\": container with ID starting with d869817a82b9f339ae214ccda8eaf5aa4fa4f2796d726fb166b8fa1a3f7e2eff not found: ID does not exist" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.613695 4764 scope.go:117] "RemoveContainer" containerID="4db2afcb94ef0a26dd49d9961d5346346b289d4dbcccce08d916a8fe7f319ea2" Mar 09 13:41:26 crc kubenswrapper[4764]: E0309 13:41:26.614276 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4db2afcb94ef0a26dd49d9961d5346346b289d4dbcccce08d916a8fe7f319ea2\": container with ID starting with 4db2afcb94ef0a26dd49d9961d5346346b289d4dbcccce08d916a8fe7f319ea2 not found: ID does not exist" containerID="4db2afcb94ef0a26dd49d9961d5346346b289d4dbcccce08d916a8fe7f319ea2" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.614304 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4db2afcb94ef0a26dd49d9961d5346346b289d4dbcccce08d916a8fe7f319ea2"} err="failed to get container status \"4db2afcb94ef0a26dd49d9961d5346346b289d4dbcccce08d916a8fe7f319ea2\": rpc error: code = NotFound desc = could not find container \"4db2afcb94ef0a26dd49d9961d5346346b289d4dbcccce08d916a8fe7f319ea2\": container with ID starting with 4db2afcb94ef0a26dd49d9961d5346346b289d4dbcccce08d916a8fe7f319ea2 not found: ID does not exist" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.614320 4764 scope.go:117] "RemoveContainer" containerID="2486ace2409cc48f14efc281a2eed3fa9e4956d145642ec1061f64d96ea18a5f" Mar 09 13:41:26 crc kubenswrapper[4764]: E0309 
13:41:26.614573 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2486ace2409cc48f14efc281a2eed3fa9e4956d145642ec1061f64d96ea18a5f\": container with ID starting with 2486ace2409cc48f14efc281a2eed3fa9e4956d145642ec1061f64d96ea18a5f not found: ID does not exist" containerID="2486ace2409cc48f14efc281a2eed3fa9e4956d145642ec1061f64d96ea18a5f" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.614630 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2486ace2409cc48f14efc281a2eed3fa9e4956d145642ec1061f64d96ea18a5f"} err="failed to get container status \"2486ace2409cc48f14efc281a2eed3fa9e4956d145642ec1061f64d96ea18a5f\": rpc error: code = NotFound desc = could not find container \"2486ace2409cc48f14efc281a2eed3fa9e4956d145642ec1061f64d96ea18a5f\": container with ID starting with 2486ace2409cc48f14efc281a2eed3fa9e4956d145642ec1061f64d96ea18a5f not found: ID does not exist" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.651696 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-internal-tls-certs\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.651926 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-logs\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.652241 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-config-data\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.652295 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-config-data-custom\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.652369 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-public-tls-certs\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.653615 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfxg4\" (UniqueName: \"kubernetes.io/projected/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-kube-api-access-pfxg4\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.654024 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-combined-ca-bundle\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.654169 4764 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.755822 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-logs\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.755936 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-config-data\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.756427 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-logs\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.755970 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-config-data-custom\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.757107 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-public-tls-certs\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " 
pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.757625 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfxg4\" (UniqueName: \"kubernetes.io/projected/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-kube-api-access-pfxg4\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.758379 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-combined-ca-bundle\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.759101 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-internal-tls-certs\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.760135 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-config-data-custom\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.760846 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-config-data\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 
09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.762534 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-internal-tls-certs\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.764428 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-public-tls-certs\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.767821 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-combined-ca-bundle\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.779905 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfxg4\" (UniqueName: \"kubernetes.io/projected/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-kube-api-access-pfxg4\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.854091 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.894503 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.904123 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.924210 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.929499 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.938386 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.941190 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.967448 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.066701 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-scripts\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.067225 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-run-httpd\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.067263 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-config-data\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.067313 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.067356 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-log-httpd\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.067381 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpxx4\" (UniqueName: \"kubernetes.io/projected/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-kube-api-access-qpxx4\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.067420 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.169756 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-scripts\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.169874 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-run-httpd\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.169908 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-config-data\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.169964 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.170003 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-log-httpd\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.170023 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpxx4\" (UniqueName: \"kubernetes.io/projected/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-kube-api-access-qpxx4\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 
13:41:27.170066 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.170499 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-run-httpd\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.170836 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-log-httpd\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.179671 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.179887 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-config-data\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.180030 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-scripts\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " 
pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.183600 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.192326 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpxx4\" (UniqueName: \"kubernetes.io/projected/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-kube-api-access-qpxx4\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.313110 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.371093 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-94887676d-fp9dl"] Mar 09 13:41:27 crc kubenswrapper[4764]: W0309 13:41:27.385957 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8389bcb_fcb2_48b4_a1c2_3ae7427ecc19.slice/crio-d669121abd84d11c4a6a8316b10e175a9237b52ac50e78e7464435c0f73518ce WatchSource:0}: Error finding container d669121abd84d11c4a6a8316b10e175a9237b52ac50e78e7464435c0f73518ce: Status 404 returned error can't find the container with id d669121abd84d11c4a6a8316b10e175a9237b52ac50e78e7464435c0f73518ce Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.392984 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c6f54974-hws5g" event={"ID":"154490f8-97ab-4703-a96c-16b6d5f7a178","Type":"ContainerStarted","Data":"d4b3b72c9386bd5a9ede799e410a50128df2c3496a09b743dba392f2cc5e257f"} Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 
13:41:27.393060 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c6f54974-hws5g" event={"ID":"154490f8-97ab-4703-a96c-16b6d5f7a178","Type":"ContainerStarted","Data":"4ae3d0da6781eabbb5adafca215ad12ad0c3d525f95ef600a036122539dea3c9"} Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.404845 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869f779d85-qsc97" event={"ID":"c38cb253-b945-43b3-8dcd-209682d40f11","Type":"ContainerStarted","Data":"15010747611fb0fe00d92531331c8086dfb085efb07034db0f644ee0d2a65525"} Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.405779 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-869f779d85-qsc97" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.419509 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7c6f54974-hws5g" podStartSLOduration=3.247329244 podStartE2EDuration="5.419486129s" podCreationTimestamp="2026-03-09 13:41:22 +0000 UTC" firstStartedPulling="2026-03-09 13:41:23.927972964 +0000 UTC m=+1239.178144872" lastFinishedPulling="2026-03-09 13:41:26.100129849 +0000 UTC m=+1241.350301757" observedRunningTime="2026-03-09 13:41:27.411769585 +0000 UTC m=+1242.661941503" watchObservedRunningTime="2026-03-09 13:41:27.419486129 +0000 UTC m=+1242.669658037" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.439231 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-789c56cf69-2dj2c" event={"ID":"a18071d3-1164-4080-9095-919bb5349bb8","Type":"ContainerStarted","Data":"4672c16d453e1e274b6da68e97c12c8549633d1027aae998ed8d544a8dc9eae4"} Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.439309 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-789c56cf69-2dj2c" 
event={"ID":"a18071d3-1164-4080-9095-919bb5349bb8","Type":"ContainerStarted","Data":"6db9a396567078aa433f28579d87de04a68fe1f9ac000331309e7871a4e32d55"} Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.451291 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-869f779d85-qsc97" podStartSLOduration=4.451262735 podStartE2EDuration="4.451262735s" podCreationTimestamp="2026-03-09 13:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:41:27.439190312 +0000 UTC m=+1242.689362230" watchObservedRunningTime="2026-03-09 13:41:27.451262735 +0000 UTC m=+1242.701434633" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.474480 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-789c56cf69-2dj2c" podStartSLOduration=3.372097561 podStartE2EDuration="5.474448176s" podCreationTimestamp="2026-03-09 13:41:22 +0000 UTC" firstStartedPulling="2026-03-09 13:41:24.002529123 +0000 UTC m=+1239.252701031" lastFinishedPulling="2026-03-09 13:41:26.104879738 +0000 UTC m=+1241.355051646" observedRunningTime="2026-03-09 13:41:27.468926818 +0000 UTC m=+1242.719098746" watchObservedRunningTime="2026-03-09 13:41:27.474448176 +0000 UTC m=+1242.724620074" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.577517 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" path="/var/lib/kubelet/pods/861cdd7d-b563-4009-9c33-a5c64d6ffae9/volumes" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.823193 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:41:27 crc kubenswrapper[4764]: W0309 13:41:27.840581 4764 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf5caebe_ad05_48b0_bbce_ecb2ec29e7c3.slice/crio-a9199722fbed2e4a6e2a95c99d206366b5d4467326eba28ca7c133400c787e70 WatchSource:0}: Error finding container a9199722fbed2e4a6e2a95c99d206366b5d4467326eba28ca7c133400c787e70: Status 404 returned error can't find the container with id a9199722fbed2e4a6e2a95c99d206366b5d4467326eba28ca7c133400c787e70 Mar 09 13:41:28 crc kubenswrapper[4764]: I0309 13:41:28.451119 4764 generic.go:334] "Generic (PLEG): container finished" podID="74146b7d-9780-4d2d-9454-853296f88955" containerID="1a80c7f5de2d815f10d1b147b52b1c923bf4e3266278afd841a98b5bc66a8ad6" exitCode=0 Mar 09 13:41:28 crc kubenswrapper[4764]: I0309 13:41:28.451205 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dp5x6" event={"ID":"74146b7d-9780-4d2d-9454-853296f88955","Type":"ContainerDied","Data":"1a80c7f5de2d815f10d1b147b52b1c923bf4e3266278afd841a98b5bc66a8ad6"} Mar 09 13:41:28 crc kubenswrapper[4764]: I0309 13:41:28.452333 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3","Type":"ContainerStarted","Data":"a9199722fbed2e4a6e2a95c99d206366b5d4467326eba28ca7c133400c787e70"} Mar 09 13:41:28 crc kubenswrapper[4764]: I0309 13:41:28.453949 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-94887676d-fp9dl" event={"ID":"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19","Type":"ContainerStarted","Data":"c1ae48c7a6db67e1b6ac1ce4507bf46720f7c1f1d15e14d2c397114aeca82582"} Mar 09 13:41:28 crc kubenswrapper[4764]: I0309 13:41:28.453996 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-94887676d-fp9dl" event={"ID":"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19","Type":"ContainerStarted","Data":"0afbaf2d7a7ce41452a514f65b61e8e6312ab49e880061ea621f68858d4cab47"} Mar 09 13:41:28 crc kubenswrapper[4764]: I0309 13:41:28.454008 4764 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/barbican-api-94887676d-fp9dl" event={"ID":"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19","Type":"ContainerStarted","Data":"d669121abd84d11c4a6a8316b10e175a9237b52ac50e78e7464435c0f73518ce"} Mar 09 13:41:28 crc kubenswrapper[4764]: I0309 13:41:28.503373 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-94887676d-fp9dl" podStartSLOduration=2.503352495 podStartE2EDuration="2.503352495s" podCreationTimestamp="2026-03-09 13:41:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:41:28.493764295 +0000 UTC m=+1243.743936223" watchObservedRunningTime="2026-03-09 13:41:28.503352495 +0000 UTC m=+1243.753524403" Mar 09 13:41:29 crc kubenswrapper[4764]: I0309 13:41:29.464211 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3","Type":"ContainerStarted","Data":"d86adcecb94ef745afb424352583b984c64f486d6d2dfabedad259db0634d772"} Mar 09 13:41:29 crc kubenswrapper[4764]: I0309 13:41:29.464755 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:29 crc kubenswrapper[4764]: I0309 13:41:29.464776 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:29 crc kubenswrapper[4764]: I0309 13:41:29.868389 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:41:29 crc kubenswrapper[4764]: I0309 13:41:29.953574 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-scripts\") pod \"74146b7d-9780-4d2d-9454-853296f88955\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " Mar 09 13:41:29 crc kubenswrapper[4764]: I0309 13:41:29.953709 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-config-data\") pod \"74146b7d-9780-4d2d-9454-853296f88955\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " Mar 09 13:41:29 crc kubenswrapper[4764]: I0309 13:41:29.953789 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74146b7d-9780-4d2d-9454-853296f88955-etc-machine-id\") pod \"74146b7d-9780-4d2d-9454-853296f88955\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " Mar 09 13:41:29 crc kubenswrapper[4764]: I0309 13:41:29.953874 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khssq\" (UniqueName: \"kubernetes.io/projected/74146b7d-9780-4d2d-9454-853296f88955-kube-api-access-khssq\") pod \"74146b7d-9780-4d2d-9454-853296f88955\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " Mar 09 13:41:29 crc kubenswrapper[4764]: I0309 13:41:29.954036 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74146b7d-9780-4d2d-9454-853296f88955-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "74146b7d-9780-4d2d-9454-853296f88955" (UID: "74146b7d-9780-4d2d-9454-853296f88955"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:41:29 crc kubenswrapper[4764]: I0309 13:41:29.953901 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-db-sync-config-data\") pod \"74146b7d-9780-4d2d-9454-853296f88955\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " Mar 09 13:41:29 crc kubenswrapper[4764]: I0309 13:41:29.954696 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-combined-ca-bundle\") pod \"74146b7d-9780-4d2d-9454-853296f88955\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " Mar 09 13:41:29 crc kubenswrapper[4764]: I0309 13:41:29.955222 4764 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74146b7d-9780-4d2d-9454-853296f88955-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:29 crc kubenswrapper[4764]: I0309 13:41:29.969389 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "74146b7d-9780-4d2d-9454-853296f88955" (UID: "74146b7d-9780-4d2d-9454-853296f88955"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:29 crc kubenswrapper[4764]: I0309 13:41:29.969726 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74146b7d-9780-4d2d-9454-853296f88955-kube-api-access-khssq" (OuterVolumeSpecName: "kube-api-access-khssq") pod "74146b7d-9780-4d2d-9454-853296f88955" (UID: "74146b7d-9780-4d2d-9454-853296f88955"). InnerVolumeSpecName "kube-api-access-khssq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:41:29 crc kubenswrapper[4764]: I0309 13:41:29.995926 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-scripts" (OuterVolumeSpecName: "scripts") pod "74146b7d-9780-4d2d-9454-853296f88955" (UID: "74146b7d-9780-4d2d-9454-853296f88955"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.056912 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74146b7d-9780-4d2d-9454-853296f88955" (UID: "74146b7d-9780-4d2d-9454-853296f88955"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.056953 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.057503 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khssq\" (UniqueName: \"kubernetes.io/projected/74146b7d-9780-4d2d-9454-853296f88955-kube-api-access-khssq\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.057524 4764 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.159199 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" 
Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.168768 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-config-data" (OuterVolumeSpecName: "config-data") pod "74146b7d-9780-4d2d-9454-853296f88955" (UID: "74146b7d-9780-4d2d-9454-853296f88955"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.261447 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.483797 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dp5x6" event={"ID":"74146b7d-9780-4d2d-9454-853296f88955","Type":"ContainerDied","Data":"eedf3d5fa18d75dee38a49a615e9d2f0a831d8c6f13a33063d424dd9831c9a81"} Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.485768 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eedf3d5fa18d75dee38a49a615e9d2f0a831d8c6f13a33063d424dd9831c9a81" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.485053 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.489017 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3","Type":"ContainerStarted","Data":"4fe2306e68853bfa3180ad2e5295480a2103723960a9860736e8ad880fc1db61"} Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.489104 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3","Type":"ContainerStarted","Data":"1e8bef222d882b21a7ec25aed9429545abfc2c7a862b52718f01f100ae12b17d"} Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.794947 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 13:41:30 crc kubenswrapper[4764]: E0309 13:41:30.795906 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74146b7d-9780-4d2d-9454-853296f88955" containerName="cinder-db-sync" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.795926 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="74146b7d-9780-4d2d-9454-853296f88955" containerName="cinder-db-sync" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.796118 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="74146b7d-9780-4d2d-9454-853296f88955" containerName="cinder-db-sync" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.797946 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.805043 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.805829 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.807520 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.807774 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-4vt4n" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.808391 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.878284 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-config-data\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.883901 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.884252 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rsb9\" (UniqueName: \"kubernetes.io/projected/6afeb6a7-a0a0-40de-90ee-97b497663798-kube-api-access-5rsb9\") pod \"cinder-scheduler-0\" (UID: 
\"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.884451 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.884618 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-scripts\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.885139 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6afeb6a7-a0a0-40de-90ee-97b497663798-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.960959 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-qsc97"] Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.977208 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-869f779d85-qsc97" podUID="c38cb253-b945-43b3-8dcd-209682d40f11" containerName="dnsmasq-dns" containerID="cri-o://15010747611fb0fe00d92531331c8086dfb085efb07034db0f644ee0d2a65525" gracePeriod=10 Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.988877 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rsb9\" (UniqueName: 
\"kubernetes.io/projected/6afeb6a7-a0a0-40de-90ee-97b497663798-kube-api-access-5rsb9\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.988961 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.988992 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-scripts\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.989080 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6afeb6a7-a0a0-40de-90ee-97b497663798-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.989122 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-config-data\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.989147 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " 
pod="openstack/cinder-scheduler-0" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.991033 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6afeb6a7-a0a0-40de-90ee-97b497663798-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.999404 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.000203 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-scripts\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.009385 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.014792 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-config-data\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.014898 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-d2v25"] Mar 09 
13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.016791 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.023818 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-d2v25"] Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.034706 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rsb9\" (UniqueName: \"kubernetes.io/projected/6afeb6a7-a0a0-40de-90ee-97b497663798-kube-api-access-5rsb9\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.101013 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-ovsdbserver-sb\") pod \"dnsmasq-dns-58db5546cc-d2v25\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.101129 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcpch\" (UniqueName: \"kubernetes.io/projected/4e886001-842a-4f97-b4c3-d088d80e6a45-kube-api-access-zcpch\") pod \"dnsmasq-dns-58db5546cc-d2v25\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.101193 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-ovsdbserver-nb\") pod \"dnsmasq-dns-58db5546cc-d2v25\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:31 crc kubenswrapper[4764]: 
I0309 13:41:31.101247 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-config\") pod \"dnsmasq-dns-58db5546cc-d2v25\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.101288 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-dns-svc\") pod \"dnsmasq-dns-58db5546cc-d2v25\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.195809 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.203395 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-ovsdbserver-nb\") pod \"dnsmasq-dns-58db5546cc-d2v25\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.203512 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-config\") pod \"dnsmasq-dns-58db5546cc-d2v25\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.203566 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-dns-svc\") pod \"dnsmasq-dns-58db5546cc-d2v25\" (UID: 
\"4e886001-842a-4f97-b4c3-d088d80e6a45\") " pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.203610 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-ovsdbserver-sb\") pod \"dnsmasq-dns-58db5546cc-d2v25\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.203689 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcpch\" (UniqueName: \"kubernetes.io/projected/4e886001-842a-4f97-b4c3-d088d80e6a45-kube-api-access-zcpch\") pod \"dnsmasq-dns-58db5546cc-d2v25\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.204789 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-dns-svc\") pod \"dnsmasq-dns-58db5546cc-d2v25\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.204841 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-ovsdbserver-nb\") pod \"dnsmasq-dns-58db5546cc-d2v25\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.206179 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-config\") pod \"dnsmasq-dns-58db5546cc-d2v25\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 
09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.206331 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-ovsdbserver-sb\") pod \"dnsmasq-dns-58db5546cc-d2v25\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.240466 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcpch\" (UniqueName: \"kubernetes.io/projected/4e886001-842a-4f97-b4c3-d088d80e6a45-kube-api-access-zcpch\") pod \"dnsmasq-dns-58db5546cc-d2v25\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.253286 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.254831 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.277090 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.305622 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-scripts\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.305734 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c43805e-424a-4820-924b-314b3e2f0a84-logs\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.306158 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8c43805e-424a-4820-924b-314b3e2f0a84-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.306358 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-config-data-custom\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.306440 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz6zt\" (UniqueName: \"kubernetes.io/projected/8c43805e-424a-4820-924b-314b3e2f0a84-kube-api-access-zz6zt\") pod 
\"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.306544 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.306574 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-config-data\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.314997 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.414347 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8c43805e-424a-4820-924b-314b3e2f0a84-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.414435 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8c43805e-424a-4820-924b-314b3e2f0a84-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.415541 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.415574 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz6zt\" (UniqueName: \"kubernetes.io/projected/8c43805e-424a-4820-924b-314b3e2f0a84-kube-api-access-zz6zt\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.415607 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.415627 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-config-data\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.415689 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-scripts\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.415737 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c43805e-424a-4820-924b-314b3e2f0a84-logs\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.416237 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8c43805e-424a-4820-924b-314b3e2f0a84-logs\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.422494 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.423255 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-config-data-custom\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.424291 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-scripts\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.430929 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-config-data\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.462343 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz6zt\" (UniqueName: \"kubernetes.io/projected/8c43805e-424a-4820-924b-314b3e2f0a84-kube-api-access-zz6zt\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.505322 4764 generic.go:334] 
"Generic (PLEG): container finished" podID="c38cb253-b945-43b3-8dcd-209682d40f11" containerID="15010747611fb0fe00d92531331c8086dfb085efb07034db0f644ee0d2a65525" exitCode=0 Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.505368 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869f779d85-qsc97" event={"ID":"c38cb253-b945-43b3-8dcd-209682d40f11","Type":"ContainerDied","Data":"15010747611fb0fe00d92531331c8086dfb085efb07034db0f644ee0d2a65525"} Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.530462 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.620106 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.702586 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869f779d85-qsc97" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.734880 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-ovsdbserver-nb\") pod \"c38cb253-b945-43b3-8dcd-209682d40f11\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.735042 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-489r4\" (UniqueName: \"kubernetes.io/projected/c38cb253-b945-43b3-8dcd-209682d40f11-kube-api-access-489r4\") pod \"c38cb253-b945-43b3-8dcd-209682d40f11\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.735147 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-config\") pod 
\"c38cb253-b945-43b3-8dcd-209682d40f11\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.735254 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-dns-svc\") pod \"c38cb253-b945-43b3-8dcd-209682d40f11\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.735374 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-ovsdbserver-sb\") pod \"c38cb253-b945-43b3-8dcd-209682d40f11\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.761322 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c38cb253-b945-43b3-8dcd-209682d40f11-kube-api-access-489r4" (OuterVolumeSpecName: "kube-api-access-489r4") pod "c38cb253-b945-43b3-8dcd-209682d40f11" (UID: "c38cb253-b945-43b3-8dcd-209682d40f11"). InnerVolumeSpecName "kube-api-access-489r4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.817103 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c38cb253-b945-43b3-8dcd-209682d40f11" (UID: "c38cb253-b945-43b3-8dcd-209682d40f11"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.833093 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c38cb253-b945-43b3-8dcd-209682d40f11" (UID: "c38cb253-b945-43b3-8dcd-209682d40f11"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.840946 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-489r4\" (UniqueName: \"kubernetes.io/projected/c38cb253-b945-43b3-8dcd-209682d40f11-kube-api-access-489r4\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.840993 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.841003 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.877162 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-config" (OuterVolumeSpecName: "config") pod "c38cb253-b945-43b3-8dcd-209682d40f11" (UID: "c38cb253-b945-43b3-8dcd-209682d40f11"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.906180 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c38cb253-b945-43b3-8dcd-209682d40f11" (UID: "c38cb253-b945-43b3-8dcd-209682d40f11"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.944846 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.944906 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.944938 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:31 crc kubenswrapper[4764]: W0309 13:41:31.958413 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6afeb6a7_a0a0_40de_90ee_97b497663798.slice/crio-ca431b92c1671db841cf4cedf0d25ffe1db3212b3b5515413a5c717be63cab9f WatchSource:0}: Error finding container ca431b92c1671db841cf4cedf0d25ffe1db3212b3b5515413a5c717be63cab9f: Status 404 returned error can't find the container with id ca431b92c1671db841cf4cedf0d25ffe1db3212b3b5515413a5c717be63cab9f Mar 09 13:41:32 crc kubenswrapper[4764]: I0309 13:41:32.239557 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-d2v25"] Mar 09 13:41:32 crc kubenswrapper[4764]: I0309 13:41:32.373123 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-api-0"] Mar 09 13:41:32 crc kubenswrapper[4764]: I0309 13:41:32.579354 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869f779d85-qsc97" event={"ID":"c38cb253-b945-43b3-8dcd-209682d40f11","Type":"ContainerDied","Data":"744dd94a1b4869c7dc37123bb898cb66f67f114bb38f59e8f906fc2e985184ea"} Mar 09 13:41:32 crc kubenswrapper[4764]: I0309 13:41:32.579372 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869f779d85-qsc97" Mar 09 13:41:32 crc kubenswrapper[4764]: I0309 13:41:32.579799 4764 scope.go:117] "RemoveContainer" containerID="15010747611fb0fe00d92531331c8086dfb085efb07034db0f644ee0d2a65525" Mar 09 13:41:32 crc kubenswrapper[4764]: I0309 13:41:32.594895 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6afeb6a7-a0a0-40de-90ee-97b497663798","Type":"ContainerStarted","Data":"ca431b92c1671db841cf4cedf0d25ffe1db3212b3b5515413a5c717be63cab9f"} Mar 09 13:41:32 crc kubenswrapper[4764]: I0309 13:41:32.647155 4764 generic.go:334] "Generic (PLEG): container finished" podID="4e886001-842a-4f97-b4c3-d088d80e6a45" containerID="d406e10da6d16d2f9b6ec89cf139706a3d0d52a75646e6e8156525cfc445bd01" exitCode=0 Mar 09 13:41:32 crc kubenswrapper[4764]: I0309 13:41:32.647320 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-d2v25" event={"ID":"4e886001-842a-4f97-b4c3-d088d80e6a45","Type":"ContainerDied","Data":"d406e10da6d16d2f9b6ec89cf139706a3d0d52a75646e6e8156525cfc445bd01"} Mar 09 13:41:32 crc kubenswrapper[4764]: I0309 13:41:32.647364 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-d2v25" event={"ID":"4e886001-842a-4f97-b4c3-d088d80e6a45","Type":"ContainerStarted","Data":"f8318b8e268cec9ccfcf591135ec8e9761aa9bf10f09e2ff5ebd0b76bbd7c843"} Mar 09 13:41:32 crc kubenswrapper[4764]: I0309 13:41:32.699816 4764 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-qsc97"] Mar 09 13:41:32 crc kubenswrapper[4764]: I0309 13:41:32.704073 4764 scope.go:117] "RemoveContainer" containerID="83e5b1aa14005ff710f52b631cae367c77a62e68a139a563a732645f67f44497" Mar 09 13:41:32 crc kubenswrapper[4764]: I0309 13:41:32.760169 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8c43805e-424a-4820-924b-314b3e2f0a84","Type":"ContainerStarted","Data":"ecb76bfd718406ffe153b8cac9fb315762aaa65f4ab170d3da19c41066826c56"} Mar 09 13:41:32 crc kubenswrapper[4764]: I0309 13:41:32.796743 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-qsc97"] Mar 09 13:41:33 crc kubenswrapper[4764]: I0309 13:41:33.580416 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c38cb253-b945-43b3-8dcd-209682d40f11" path="/var/lib/kubelet/pods/c38cb253-b945-43b3-8dcd-209682d40f11/volumes" Mar 09 13:41:33 crc kubenswrapper[4764]: I0309 13:41:33.587374 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 09 13:41:33 crc kubenswrapper[4764]: I0309 13:41:33.829767 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-d2v25" event={"ID":"4e886001-842a-4f97-b4c3-d088d80e6a45","Type":"ContainerStarted","Data":"f733fa97c901fa8349b84558eaddb5374df152c045dfc4a2b694279f5d3fa434"} Mar 09 13:41:33 crc kubenswrapper[4764]: I0309 13:41:33.829940 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:33 crc kubenswrapper[4764]: I0309 13:41:33.832336 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8c43805e-424a-4820-924b-314b3e2f0a84","Type":"ContainerStarted","Data":"ddff3fcd534d4f115f4a054a32d5b0f2fe7f9d7b7685de4b4a6fc84a82659da4"} Mar 09 13:41:33 crc kubenswrapper[4764]: I0309 13:41:33.836764 4764 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3","Type":"ContainerStarted","Data":"8e00a9b64763a83494a83ac4067c14a67cca49b0a2f0a325e393f9400b39892b"} Mar 09 13:41:33 crc kubenswrapper[4764]: I0309 13:41:33.837390 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 13:41:33 crc kubenswrapper[4764]: I0309 13:41:33.862272 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58db5546cc-d2v25" podStartSLOduration=3.862246486 podStartE2EDuration="3.862246486s" podCreationTimestamp="2026-03-09 13:41:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:41:33.846282616 +0000 UTC m=+1249.096454534" watchObservedRunningTime="2026-03-09 13:41:33.862246486 +0000 UTC m=+1249.112418394" Mar 09 13:41:34 crc kubenswrapper[4764]: I0309 13:41:34.856835 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6afeb6a7-a0a0-40de-90ee-97b497663798","Type":"ContainerStarted","Data":"3048f7a722fe7b1924f522a8fa95862bc27ca27611f2c6a75277ad4a82b3a7d9"} Mar 09 13:41:34 crc kubenswrapper[4764]: I0309 13:41:34.862744 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8c43805e-424a-4820-924b-314b3e2f0a84","Type":"ContainerStarted","Data":"e3bfc05e3d43a60e9bf0152b630c3da61e89fcd4bab24c5c69069ee06ec1458b"} Mar 09 13:41:34 crc kubenswrapper[4764]: I0309 13:41:34.862790 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="8c43805e-424a-4820-924b-314b3e2f0a84" containerName="cinder-api-log" containerID="cri-o://ddff3fcd534d4f115f4a054a32d5b0f2fe7f9d7b7685de4b4a6fc84a82659da4" gracePeriod=30 Mar 09 13:41:34 crc kubenswrapper[4764]: I0309 13:41:34.862900 4764 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/cinder-api-0" podUID="8c43805e-424a-4820-924b-314b3e2f0a84" containerName="cinder-api" containerID="cri-o://e3bfc05e3d43a60e9bf0152b630c3da61e89fcd4bab24c5c69069ee06ec1458b" gracePeriod=30 Mar 09 13:41:34 crc kubenswrapper[4764]: I0309 13:41:34.862919 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 09 13:41:34 crc kubenswrapper[4764]: I0309 13:41:34.898367 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.898327675 podStartE2EDuration="3.898327675s" podCreationTimestamp="2026-03-09 13:41:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:41:34.883822712 +0000 UTC m=+1250.133994620" watchObservedRunningTime="2026-03-09 13:41:34.898327675 +0000 UTC m=+1250.148499583" Mar 09 13:41:34 crc kubenswrapper[4764]: I0309 13:41:34.907199 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.3282932689999996 podStartE2EDuration="8.907166577s" podCreationTimestamp="2026-03-09 13:41:26 +0000 UTC" firstStartedPulling="2026-03-09 13:41:27.84361533 +0000 UTC m=+1243.093787238" lastFinishedPulling="2026-03-09 13:41:32.422488628 +0000 UTC m=+1247.672660546" observedRunningTime="2026-03-09 13:41:33.886214346 +0000 UTC m=+1249.136386254" watchObservedRunningTime="2026-03-09 13:41:34.907166577 +0000 UTC m=+1250.157338485" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.300365 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6b74bc6bc6-vsxl5" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.646086 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6c97985d69-khcvg"] Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.655214 4764 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6c97985d69-khcvg" podUID="5d65fa53-be02-4d11-b300-5cb4629c03da" containerName="neutron-api" containerID="cri-o://af86c5d555559ee78e3558628ecf704c4c51493754120be1f860b79edb606896" gracePeriod=30 Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.656818 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6c97985d69-khcvg" podUID="5d65fa53-be02-4d11-b300-5cb4629c03da" containerName="neutron-httpd" containerID="cri-o://cff13fa571b5cb0d7d8cf860c4a1efdb55518757df6a7508bbd7dca053bc0e44" gracePeriod=30 Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.728465 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6b7bfdfd5-56dnz"] Mar 09 13:41:35 crc kubenswrapper[4764]: E0309 13:41:35.728953 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c38cb253-b945-43b3-8dcd-209682d40f11" containerName="dnsmasq-dns" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.728971 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c38cb253-b945-43b3-8dcd-209682d40f11" containerName="dnsmasq-dns" Mar 09 13:41:35 crc kubenswrapper[4764]: E0309 13:41:35.728985 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c38cb253-b945-43b3-8dcd-209682d40f11" containerName="init" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.728992 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c38cb253-b945-43b3-8dcd-209682d40f11" containerName="init" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.729194 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c38cb253-b945-43b3-8dcd-209682d40f11" containerName="dnsmasq-dns" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.730288 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.766458 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6b7bfdfd5-56dnz"] Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.781778 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-httpd-config\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.781832 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-config\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.781870 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-internal-tls-certs\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.781921 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8f9s\" (UniqueName: \"kubernetes.io/projected/fd7dadfc-b8e4-479f-8880-4ffeec051d30-kube-api-access-h8f9s\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.781951 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-ovndb-tls-certs\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.782002 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-public-tls-certs\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.782022 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-combined-ca-bundle\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.911534 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-public-tls-certs\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.911609 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-combined-ca-bundle\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.911732 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-httpd-config\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.911794 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-config\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.911857 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-internal-tls-certs\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.911938 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8f9s\" (UniqueName: \"kubernetes.io/projected/fd7dadfc-b8e4-479f-8880-4ffeec051d30-kube-api-access-h8f9s\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.912000 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-ovndb-tls-certs\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.920630 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-ovndb-tls-certs\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: 
\"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.923991 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-httpd-config\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.927160 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-config\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.941338 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6c97985d69-khcvg" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.968266 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-internal-tls-certs\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.973855 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-public-tls-certs\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.979382 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8f9s\" (UniqueName: \"kubernetes.io/projected/fd7dadfc-b8e4-479f-8880-4ffeec051d30-kube-api-access-h8f9s\") pod 
\"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.981584 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-combined-ca-bundle\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.985344 4764 generic.go:334] "Generic (PLEG): container finished" podID="8c43805e-424a-4820-924b-314b3e2f0a84" containerID="e3bfc05e3d43a60e9bf0152b630c3da61e89fcd4bab24c5c69069ee06ec1458b" exitCode=0 Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.985369 4764 generic.go:334] "Generic (PLEG): container finished" podID="8c43805e-424a-4820-924b-314b3e2f0a84" containerID="ddff3fcd534d4f115f4a054a32d5b0f2fe7f9d7b7685de4b4a6fc84a82659da4" exitCode=143 Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.985476 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8c43805e-424a-4820-924b-314b3e2f0a84","Type":"ContainerDied","Data":"e3bfc05e3d43a60e9bf0152b630c3da61e89fcd4bab24c5c69069ee06ec1458b"} Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.985509 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8c43805e-424a-4820-924b-314b3e2f0a84","Type":"ContainerDied","Data":"ddff3fcd534d4f115f4a054a32d5b0f2fe7f9d7b7685de4b4a6fc84a82659da4"} Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.035888 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6afeb6a7-a0a0-40de-90ee-97b497663798","Type":"ContainerStarted","Data":"dd358903bd4bede8bf029bbf11c70e3e47fe50f447bead2b369061bf66fe7262"} Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.089109 4764 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.122871 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.988840153 podStartE2EDuration="6.122847117s" podCreationTimestamp="2026-03-09 13:41:30 +0000 UTC" firstStartedPulling="2026-03-09 13:41:31.962365485 +0000 UTC m=+1247.212537393" lastFinishedPulling="2026-03-09 13:41:33.096372449 +0000 UTC m=+1248.346544357" observedRunningTime="2026-03-09 13:41:36.107267197 +0000 UTC m=+1251.357439105" watchObservedRunningTime="2026-03-09 13:41:36.122847117 +0000 UTC m=+1251.373019025" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.198009 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.447086 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.518276 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7dcdc6487b-j8w75" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.570622 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-scripts\") pod \"8c43805e-424a-4820-924b-314b3e2f0a84\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.570722 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8c43805e-424a-4820-924b-314b3e2f0a84-etc-machine-id\") pod \"8c43805e-424a-4820-924b-314b3e2f0a84\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.570750 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-config-data\") pod \"8c43805e-424a-4820-924b-314b3e2f0a84\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.570840 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-config-data-custom\") pod \"8c43805e-424a-4820-924b-314b3e2f0a84\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.570906 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c43805e-424a-4820-924b-314b3e2f0a84-logs\") pod \"8c43805e-424a-4820-924b-314b3e2f0a84\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.571186 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-combined-ca-bundle\") pod \"8c43805e-424a-4820-924b-314b3e2f0a84\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.571231 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz6zt\" (UniqueName: \"kubernetes.io/projected/8c43805e-424a-4820-924b-314b3e2f0a84-kube-api-access-zz6zt\") pod \"8c43805e-424a-4820-924b-314b3e2f0a84\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.574871 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8c43805e-424a-4820-924b-314b3e2f0a84-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8c43805e-424a-4820-924b-314b3e2f0a84" 
(UID: "8c43805e-424a-4820-924b-314b3e2f0a84"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.577851 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c43805e-424a-4820-924b-314b3e2f0a84-logs" (OuterVolumeSpecName: "logs") pod "8c43805e-424a-4820-924b-314b3e2f0a84" (UID: "8c43805e-424a-4820-924b-314b3e2f0a84"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.580158 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c43805e-424a-4820-924b-314b3e2f0a84-kube-api-access-zz6zt" (OuterVolumeSpecName: "kube-api-access-zz6zt") pod "8c43805e-424a-4820-924b-314b3e2f0a84" (UID: "8c43805e-424a-4820-924b-314b3e2f0a84"). InnerVolumeSpecName "kube-api-access-zz6zt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.586918 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-scripts" (OuterVolumeSpecName: "scripts") pod "8c43805e-424a-4820-924b-314b3e2f0a84" (UID: "8c43805e-424a-4820-924b-314b3e2f0a84"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.590920 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8c43805e-424a-4820-924b-314b3e2f0a84" (UID: "8c43805e-424a-4820-924b-314b3e2f0a84"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.641257 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c43805e-424a-4820-924b-314b3e2f0a84" (UID: "8c43805e-424a-4820-924b-314b3e2f0a84"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.643254 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-config-data" (OuterVolumeSpecName: "config-data") pod "8c43805e-424a-4820-924b-314b3e2f0a84" (UID: "8c43805e-424a-4820-924b-314b3e2f0a84"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.676140 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.676171 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz6zt\" (UniqueName: \"kubernetes.io/projected/8c43805e-424a-4820-924b-314b3e2f0a84-kube-api-access-zz6zt\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.676183 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.676192 4764 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8c43805e-424a-4820-924b-314b3e2f0a84-etc-machine-id\") on node \"crc\" DevicePath 
\"\"" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.676202 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.676213 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.676222 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c43805e-424a-4820-924b-314b3e2f0a84-logs\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.783442 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7dcdc6487b-j8w75" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.057200 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.057731 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8c43805e-424a-4820-924b-314b3e2f0a84","Type":"ContainerDied","Data":"ecb76bfd718406ffe153b8cac9fb315762aaa65f4ab170d3da19c41066826c56"} Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.057824 4764 scope.go:117] "RemoveContainer" containerID="e3bfc05e3d43a60e9bf0152b630c3da61e89fcd4bab24c5c69069ee06ec1458b" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.122952 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.137730 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.152013 4764 scope.go:117] "RemoveContainer" containerID="ddff3fcd534d4f115f4a054a32d5b0f2fe7f9d7b7685de4b4a6fc84a82659da4" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.156901 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 09 13:41:37 crc kubenswrapper[4764]: E0309 13:41:37.157412 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c43805e-424a-4820-924b-314b3e2f0a84" containerName="cinder-api" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.157429 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c43805e-424a-4820-924b-314b3e2f0a84" containerName="cinder-api" Mar 09 13:41:37 crc kubenswrapper[4764]: E0309 13:41:37.157453 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c43805e-424a-4820-924b-314b3e2f0a84" containerName="cinder-api-log" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.157459 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c43805e-424a-4820-924b-314b3e2f0a84" containerName="cinder-api-log" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.157684 4764 
memory_manager.go:354] "RemoveStaleState removing state" podUID="8c43805e-424a-4820-924b-314b3e2f0a84" containerName="cinder-api" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.157737 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c43805e-424a-4820-924b-314b3e2f0a84" containerName="cinder-api-log" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.160546 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.165323 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.165880 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.166049 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.186249 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.298991 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-config-data-custom\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.299066 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfrtf\" (UniqueName: \"kubernetes.io/projected/9cf43ab7-e625-4ffa-9af4-9f810a43d270-kube-api-access-rfrtf\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.299137 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.299185 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.299212 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-config-data\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.299431 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-scripts\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.299512 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9cf43ab7-e625-4ffa-9af4-9f810a43d270-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.299559 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9cf43ab7-e625-4ffa-9af4-9f810a43d270-logs\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.299693 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.347913 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6b7bfdfd5-56dnz"] Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.401416 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9cf43ab7-e625-4ffa-9af4-9f810a43d270-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.401924 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cf43ab7-e625-4ffa-9af4-9f810a43d270-logs\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.401966 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.402060 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-config-data-custom\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.402092 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfrtf\" (UniqueName: \"kubernetes.io/projected/9cf43ab7-e625-4ffa-9af4-9f810a43d270-kube-api-access-rfrtf\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.402160 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.402280 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-config-data\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.402303 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.402329 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-scripts\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc 
kubenswrapper[4764]: I0309 13:41:37.401801 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9cf43ab7-e625-4ffa-9af4-9f810a43d270-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.403087 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cf43ab7-e625-4ffa-9af4-9f810a43d270-logs\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.408527 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-scripts\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.415178 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.415466 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.415771 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.430974 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.431069 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-config-data\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.438206 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfrtf\" (UniqueName: \"kubernetes.io/projected/9cf43ab7-e625-4ffa-9af4-9f810a43d270-kube-api-access-rfrtf\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.541199 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.552373 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6c97985d69-khcvg" podUID="5d65fa53-be02-4d11-b300-5cb4629c03da" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.146:9696/\": dial tcp 10.217.0.146:9696: connect: connection refused" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.583853 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c43805e-424a-4820-924b-314b3e2f0a84" path="/var/lib/kubelet/pods/8c43805e-424a-4820-924b-314b3e2f0a84/volumes" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.726950 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-7dcdc6487b-j8w75" podUID="60a250ee-f349-4bab-b5b4-b402289210a6" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.153:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.096930 4764 generic.go:334] "Generic (PLEG): container finished" podID="5d65fa53-be02-4d11-b300-5cb4629c03da" containerID="cff13fa571b5cb0d7d8cf860c4a1efdb55518757df6a7508bbd7dca053bc0e44" exitCode=0 Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.097446 4764 generic.go:334] "Generic (PLEG): container finished" podID="5d65fa53-be02-4d11-b300-5cb4629c03da" containerID="af86c5d555559ee78e3558628ecf704c4c51493754120be1f860b79edb606896" exitCode=0 Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.097594 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c97985d69-khcvg" event={"ID":"5d65fa53-be02-4d11-b300-5cb4629c03da","Type":"ContainerDied","Data":"cff13fa571b5cb0d7d8cf860c4a1efdb55518757df6a7508bbd7dca053bc0e44"} Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.097658 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-6c97985d69-khcvg" event={"ID":"5d65fa53-be02-4d11-b300-5cb4629c03da","Type":"ContainerDied","Data":"af86c5d555559ee78e3558628ecf704c4c51493754120be1f860b79edb606896"} Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.111084 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b7bfdfd5-56dnz" event={"ID":"fd7dadfc-b8e4-479f-8880-4ffeec051d30","Type":"ContainerStarted","Data":"6f8188cfb291dea06d23e38b0677f28aab694b1f88c37dae466403569d1a0201"} Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.111153 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b7bfdfd5-56dnz" event={"ID":"fd7dadfc-b8e4-479f-8880-4ffeec051d30","Type":"ContainerStarted","Data":"1529a25d88fc576b021e03beb3031e5bc9556086c62d711281315ccd609d8726"} Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.511385 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 09 13:41:38 crc kubenswrapper[4764]: W0309 13:41:38.523789 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cf43ab7_e625_4ffa_9af4_9f810a43d270.slice/crio-16e6a56dec1e899bdf9119ed1f909410e74b18a5610ad6a7a75a8cc5bd666b78 WatchSource:0}: Error finding container 16e6a56dec1e899bdf9119ed1f909410e74b18a5610ad6a7a75a8cc5bd666b78: Status 404 returned error can't find the container with id 16e6a56dec1e899bdf9119ed1f909410e74b18a5610ad6a7a75a8cc5bd666b78 Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.656857 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6c97985d69-khcvg" Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.745442 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-public-tls-certs\") pod \"5d65fa53-be02-4d11-b300-5cb4629c03da\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.761173 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnnlq\" (UniqueName: \"kubernetes.io/projected/5d65fa53-be02-4d11-b300-5cb4629c03da-kube-api-access-bnnlq\") pod \"5d65fa53-be02-4d11-b300-5cb4629c03da\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.761281 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-internal-tls-certs\") pod \"5d65fa53-be02-4d11-b300-5cb4629c03da\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.761382 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-ovndb-tls-certs\") pod \"5d65fa53-be02-4d11-b300-5cb4629c03da\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.761458 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-config\") pod \"5d65fa53-be02-4d11-b300-5cb4629c03da\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.761608 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" 
(UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-httpd-config\") pod \"5d65fa53-be02-4d11-b300-5cb4629c03da\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.761716 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-combined-ca-bundle\") pod \"5d65fa53-be02-4d11-b300-5cb4629c03da\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.784148 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d65fa53-be02-4d11-b300-5cb4629c03da-kube-api-access-bnnlq" (OuterVolumeSpecName: "kube-api-access-bnnlq") pod "5d65fa53-be02-4d11-b300-5cb4629c03da" (UID: "5d65fa53-be02-4d11-b300-5cb4629c03da"). InnerVolumeSpecName "kube-api-access-bnnlq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.798840 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "5d65fa53-be02-4d11-b300-5cb4629c03da" (UID: "5d65fa53-be02-4d11-b300-5cb4629c03da"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.845049 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5d65fa53-be02-4d11-b300-5cb4629c03da" (UID: "5d65fa53-be02-4d11-b300-5cb4629c03da"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.859410 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d65fa53-be02-4d11-b300-5cb4629c03da" (UID: "5d65fa53-be02-4d11-b300-5cb4629c03da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.860844 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-config" (OuterVolumeSpecName: "config") pod "5d65fa53-be02-4d11-b300-5cb4629c03da" (UID: "5d65fa53-be02-4d11-b300-5cb4629c03da"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.863956 4764 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.863987 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.863998 4764 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.864007 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnnlq\" (UniqueName: \"kubernetes.io/projected/5d65fa53-be02-4d11-b300-5cb4629c03da-kube-api-access-bnnlq\") on node \"crc\" DevicePath \"\"" 
Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.864016 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.864583 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5d65fa53-be02-4d11-b300-5cb4629c03da" (UID: "5d65fa53-be02-4d11-b300-5cb4629c03da"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.888824 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "5d65fa53-be02-4d11-b300-5cb4629c03da" (UID: "5d65fa53-be02-4d11-b300-5cb4629c03da"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.970375 4764 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.974707 4764 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:39 crc kubenswrapper[4764]: I0309 13:41:39.124456 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c97985d69-khcvg" event={"ID":"5d65fa53-be02-4d11-b300-5cb4629c03da","Type":"ContainerDied","Data":"63c3a5316fc5ec3fc301ebe753725e716ab876289ed6944f416ebd4b78894abb"} Mar 09 13:41:39 crc kubenswrapper[4764]: I0309 13:41:39.124525 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6c97985d69-khcvg" Mar 09 13:41:39 crc kubenswrapper[4764]: I0309 13:41:39.124538 4764 scope.go:117] "RemoveContainer" containerID="cff13fa571b5cb0d7d8cf860c4a1efdb55518757df6a7508bbd7dca053bc0e44" Mar 09 13:41:39 crc kubenswrapper[4764]: I0309 13:41:39.141431 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9cf43ab7-e625-4ffa-9af4-9f810a43d270","Type":"ContainerStarted","Data":"16e6a56dec1e899bdf9119ed1f909410e74b18a5610ad6a7a75a8cc5bd666b78"} Mar 09 13:41:39 crc kubenswrapper[4764]: I0309 13:41:39.154813 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b7bfdfd5-56dnz" event={"ID":"fd7dadfc-b8e4-479f-8880-4ffeec051d30","Type":"ContainerStarted","Data":"4d891998d26f933839c81257749a0591a2f852656838ddfe2f49f0007eedf1e7"} Mar 09 13:41:39 crc kubenswrapper[4764]: I0309 13:41:39.155816 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:39 crc kubenswrapper[4764]: I0309 13:41:39.163572 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6c97985d69-khcvg"] Mar 09 13:41:39 crc kubenswrapper[4764]: I0309 13:41:39.174590 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6c97985d69-khcvg"] Mar 09 13:41:39 crc kubenswrapper[4764]: I0309 13:41:39.192863 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6b7bfdfd5-56dnz" podStartSLOduration=4.192837026 podStartE2EDuration="4.192837026s" podCreationTimestamp="2026-03-09 13:41:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:41:39.192457256 +0000 UTC m=+1254.442629164" watchObservedRunningTime="2026-03-09 13:41:39.192837026 +0000 UTC m=+1254.443008934" Mar 09 13:41:39 crc kubenswrapper[4764]: I0309 13:41:39.281960 4764 scope.go:117] 
"RemoveContainer" containerID="af86c5d555559ee78e3558628ecf704c4c51493754120be1f860b79edb606896" Mar 09 13:41:39 crc kubenswrapper[4764]: I0309 13:41:39.513529 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:39 crc kubenswrapper[4764]: I0309 13:41:39.586328 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d65fa53-be02-4d11-b300-5cb4629c03da" path="/var/lib/kubelet/pods/5d65fa53-be02-4d11-b300-5cb4629c03da/volumes" Mar 09 13:41:40 crc kubenswrapper[4764]: I0309 13:41:40.128903 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:40 crc kubenswrapper[4764]: I0309 13:41:40.173686 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9cf43ab7-e625-4ffa-9af4-9f810a43d270","Type":"ContainerStarted","Data":"e31a2bbf58badcb3152b918a495470f759eaa31d486f175e453e447ec70b971f"} Mar 09 13:41:40 crc kubenswrapper[4764]: I0309 13:41:40.173744 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9cf43ab7-e625-4ffa-9af4-9f810a43d270","Type":"ContainerStarted","Data":"5ca851280bdd40db8d723ce3b940914a960b436b472d7f96ef6e9363f1f7d55a"} Mar 09 13:41:40 crc kubenswrapper[4764]: I0309 13:41:40.173854 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 09 13:41:40 crc kubenswrapper[4764]: I0309 13:41:40.202924 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.202905274 podStartE2EDuration="3.202905274s" podCreationTimestamp="2026-03-09 13:41:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:41:40.193314303 +0000 UTC m=+1255.443486231" watchObservedRunningTime="2026-03-09 13:41:40.202905274 
+0000 UTC m=+1255.453077182" Mar 09 13:41:40 crc kubenswrapper[4764]: I0309 13:41:40.230813 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7dcdc6487b-j8w75"] Mar 09 13:41:40 crc kubenswrapper[4764]: I0309 13:41:40.231073 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7dcdc6487b-j8w75" podUID="60a250ee-f349-4bab-b5b4-b402289210a6" containerName="barbican-api-log" containerID="cri-o://4c95fd726c1b28b4f8ab8a6d5ebf1002f0c9dea4159ff74f568392de3a2ff067" gracePeriod=30 Mar 09 13:41:40 crc kubenswrapper[4764]: I0309 13:41:40.231213 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7dcdc6487b-j8w75" podUID="60a250ee-f349-4bab-b5b4-b402289210a6" containerName="barbican-api" containerID="cri-o://73990548f38f06ad2795c4de799dea38305ca7065b97482d22fcdeed8a8051c5" gracePeriod=30 Mar 09 13:41:41 crc kubenswrapper[4764]: I0309 13:41:41.189532 4764 generic.go:334] "Generic (PLEG): container finished" podID="60a250ee-f349-4bab-b5b4-b402289210a6" containerID="4c95fd726c1b28b4f8ab8a6d5ebf1002f0c9dea4159ff74f568392de3a2ff067" exitCode=143 Mar 09 13:41:41 crc kubenswrapper[4764]: I0309 13:41:41.189612 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dcdc6487b-j8w75" event={"ID":"60a250ee-f349-4bab-b5b4-b402289210a6","Type":"ContainerDied","Data":"4c95fd726c1b28b4f8ab8a6d5ebf1002f0c9dea4159ff74f568392de3a2ff067"} Mar 09 13:41:41 crc kubenswrapper[4764]: I0309 13:41:41.466000 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 09 13:41:41 crc kubenswrapper[4764]: I0309 13:41:41.516655 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 13:41:41 crc kubenswrapper[4764]: I0309 13:41:41.532689 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58db5546cc-d2v25" 
Mar 09 13:41:41 crc kubenswrapper[4764]: I0309 13:41:41.669670 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-sh5rn"] Mar 09 13:41:41 crc kubenswrapper[4764]: I0309 13:41:41.669949 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" podUID="5b2b268a-adc9-46ca-908a-d30ab8543059" containerName="dnsmasq-dns" containerID="cri-o://20dab72734e5903795d478bac44970ac57f0690f8f454dee9dae223fbd3e92ac" gracePeriod=10 Mar 09 13:41:41 crc kubenswrapper[4764]: I0309 13:41:41.719850 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:41 crc kubenswrapper[4764]: I0309 13:41:41.958551 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.205726 4764 generic.go:334] "Generic (PLEG): container finished" podID="5b2b268a-adc9-46ca-908a-d30ab8543059" containerID="20dab72734e5903795d478bac44970ac57f0690f8f454dee9dae223fbd3e92ac" exitCode=0 Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.206065 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="6afeb6a7-a0a0-40de-90ee-97b497663798" containerName="cinder-scheduler" containerID="cri-o://3048f7a722fe7b1924f522a8fa95862bc27ca27611f2c6a75277ad4a82b3a7d9" gracePeriod=30 Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.206500 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" event={"ID":"5b2b268a-adc9-46ca-908a-d30ab8543059","Type":"ContainerDied","Data":"20dab72734e5903795d478bac44970ac57f0690f8f454dee9dae223fbd3e92ac"} Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.206532 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" 
event={"ID":"5b2b268a-adc9-46ca-908a-d30ab8543059","Type":"ContainerDied","Data":"d82d6336b352da3e8f6c37dd12f8ae1d4a1f5e3d8a23342826dc108286471929"} Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.206544 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d82d6336b352da3e8f6c37dd12f8ae1d4a1f5e3d8a23342826dc108286471929" Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.208019 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="6afeb6a7-a0a0-40de-90ee-97b497663798" containerName="probe" containerID="cri-o://dd358903bd4bede8bf029bbf11c70e3e47fe50f447bead2b369061bf66fe7262" gracePeriod=30 Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.279310 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.382823 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-ovsdbserver-sb\") pod \"5b2b268a-adc9-46ca-908a-d30ab8543059\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.383190 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-ovsdbserver-nb\") pod \"5b2b268a-adc9-46ca-908a-d30ab8543059\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.383215 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-config\") pod \"5b2b268a-adc9-46ca-908a-d30ab8543059\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 
13:41:42.383468 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f92p7\" (UniqueName: \"kubernetes.io/projected/5b2b268a-adc9-46ca-908a-d30ab8543059-kube-api-access-f92p7\") pod \"5b2b268a-adc9-46ca-908a-d30ab8543059\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.383492 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-dns-svc\") pod \"5b2b268a-adc9-46ca-908a-d30ab8543059\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.408049 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b2b268a-adc9-46ca-908a-d30ab8543059-kube-api-access-f92p7" (OuterVolumeSpecName: "kube-api-access-f92p7") pod "5b2b268a-adc9-46ca-908a-d30ab8543059" (UID: "5b2b268a-adc9-46ca-908a-d30ab8543059"). InnerVolumeSpecName "kube-api-access-f92p7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.438387 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5b2b268a-adc9-46ca-908a-d30ab8543059" (UID: "5b2b268a-adc9-46ca-908a-d30ab8543059"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.438570 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5b2b268a-adc9-46ca-908a-d30ab8543059" (UID: "5b2b268a-adc9-46ca-908a-d30ab8543059"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.444743 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5b2b268a-adc9-46ca-908a-d30ab8543059" (UID: "5b2b268a-adc9-46ca-908a-d30ab8543059"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.465491 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-config" (OuterVolumeSpecName: "config") pod "5b2b268a-adc9-46ca-908a-d30ab8543059" (UID: "5b2b268a-adc9-46ca-908a-d30ab8543059"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.485770 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f92p7\" (UniqueName: \"kubernetes.io/projected/5b2b268a-adc9-46ca-908a-d30ab8543059-kube-api-access-f92p7\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.485805 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.485817 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.485826 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 
13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.485834 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:43 crc kubenswrapper[4764]: I0309 13:41:43.214148 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" Mar 09 13:41:43 crc kubenswrapper[4764]: I0309 13:41:43.255957 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-sh5rn"] Mar 09 13:41:43 crc kubenswrapper[4764]: I0309 13:41:43.263271 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-sh5rn"] Mar 09 13:41:43 crc kubenswrapper[4764]: I0309 13:41:43.576106 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b2b268a-adc9-46ca-908a-d30ab8543059" path="/var/lib/kubelet/pods/5b2b268a-adc9-46ca-908a-d30ab8543059/volumes" Mar 09 13:41:43 crc kubenswrapper[4764]: I0309 13:41:43.686449 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7dcdc6487b-j8w75" podUID="60a250ee-f349-4bab-b5b4-b402289210a6" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.153:9311/healthcheck\": dial tcp 10.217.0.153:9311: connect: connection refused" Mar 09 13:41:43 crc kubenswrapper[4764]: I0309 13:41:43.692005 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7dcdc6487b-j8w75" podUID="60a250ee-f349-4bab-b5b4-b402289210a6" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.153:9311/healthcheck\": dial tcp 10.217.0.153:9311: connect: connection refused" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.077541 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7dcdc6487b-j8w75" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.188406 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-config-data\") pod \"60a250ee-f349-4bab-b5b4-b402289210a6\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.188631 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60a250ee-f349-4bab-b5b4-b402289210a6-logs\") pod \"60a250ee-f349-4bab-b5b4-b402289210a6\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.188789 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-combined-ca-bundle\") pod \"60a250ee-f349-4bab-b5b4-b402289210a6\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.188861 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gstx5\" (UniqueName: \"kubernetes.io/projected/60a250ee-f349-4bab-b5b4-b402289210a6-kube-api-access-gstx5\") pod \"60a250ee-f349-4bab-b5b4-b402289210a6\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.188992 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-config-data-custom\") pod \"60a250ee-f349-4bab-b5b4-b402289210a6\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.190131 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/60a250ee-f349-4bab-b5b4-b402289210a6-logs" (OuterVolumeSpecName: "logs") pod "60a250ee-f349-4bab-b5b4-b402289210a6" (UID: "60a250ee-f349-4bab-b5b4-b402289210a6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.198420 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "60a250ee-f349-4bab-b5b4-b402289210a6" (UID: "60a250ee-f349-4bab-b5b4-b402289210a6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.198731 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60a250ee-f349-4bab-b5b4-b402289210a6-kube-api-access-gstx5" (OuterVolumeSpecName: "kube-api-access-gstx5") pod "60a250ee-f349-4bab-b5b4-b402289210a6" (UID: "60a250ee-f349-4bab-b5b4-b402289210a6"). InnerVolumeSpecName "kube-api-access-gstx5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.227497 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.241881 4764 generic.go:334] "Generic (PLEG): container finished" podID="60a250ee-f349-4bab-b5b4-b402289210a6" containerID="73990548f38f06ad2795c4de799dea38305ca7065b97482d22fcdeed8a8051c5" exitCode=0 Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.242042 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dcdc6487b-j8w75" event={"ID":"60a250ee-f349-4bab-b5b4-b402289210a6","Type":"ContainerDied","Data":"73990548f38f06ad2795c4de799dea38305ca7065b97482d22fcdeed8a8051c5"} Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.242126 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dcdc6487b-j8w75" event={"ID":"60a250ee-f349-4bab-b5b4-b402289210a6","Type":"ContainerDied","Data":"60454879be0f224ed5bb74217d2090bdcd6efc318e5799fb2111033ee5eacbe4"} Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.242193 4764 scope.go:117] "RemoveContainer" containerID="73990548f38f06ad2795c4de799dea38305ca7065b97482d22fcdeed8a8051c5" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.242389 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7dcdc6487b-j8w75" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.246869 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60a250ee-f349-4bab-b5b4-b402289210a6" (UID: "60a250ee-f349-4bab-b5b4-b402289210a6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.259042 4764 generic.go:334] "Generic (PLEG): container finished" podID="6afeb6a7-a0a0-40de-90ee-97b497663798" containerID="dd358903bd4bede8bf029bbf11c70e3e47fe50f447bead2b369061bf66fe7262" exitCode=0 Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.259082 4764 generic.go:334] "Generic (PLEG): container finished" podID="6afeb6a7-a0a0-40de-90ee-97b497663798" containerID="3048f7a722fe7b1924f522a8fa95862bc27ca27611f2c6a75277ad4a82b3a7d9" exitCode=0 Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.259110 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6afeb6a7-a0a0-40de-90ee-97b497663798","Type":"ContainerDied","Data":"dd358903bd4bede8bf029bbf11c70e3e47fe50f447bead2b369061bf66fe7262"} Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.259147 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6afeb6a7-a0a0-40de-90ee-97b497663798","Type":"ContainerDied","Data":"3048f7a722fe7b1924f522a8fa95862bc27ca27611f2c6a75277ad4a82b3a7d9"} Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.296685 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60a250ee-f349-4bab-b5b4-b402289210a6-logs\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.296726 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.296833 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gstx5\" (UniqueName: \"kubernetes.io/projected/60a250ee-f349-4bab-b5b4-b402289210a6-kube-api-access-gstx5\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:44 crc 
kubenswrapper[4764]: I0309 13:41:44.296850 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.329010 4764 scope.go:117] "RemoveContainer" containerID="4c95fd726c1b28b4f8ab8a6d5ebf1002f0c9dea4159ff74f568392de3a2ff067" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.350070 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-config-data" (OuterVolumeSpecName: "config-data") pod "60a250ee-f349-4bab-b5b4-b402289210a6" (UID: "60a250ee-f349-4bab-b5b4-b402289210a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.377868 4764 scope.go:117] "RemoveContainer" containerID="73990548f38f06ad2795c4de799dea38305ca7065b97482d22fcdeed8a8051c5" Mar 09 13:41:44 crc kubenswrapper[4764]: E0309 13:41:44.389795 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73990548f38f06ad2795c4de799dea38305ca7065b97482d22fcdeed8a8051c5\": container with ID starting with 73990548f38f06ad2795c4de799dea38305ca7065b97482d22fcdeed8a8051c5 not found: ID does not exist" containerID="73990548f38f06ad2795c4de799dea38305ca7065b97482d22fcdeed8a8051c5" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.389851 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73990548f38f06ad2795c4de799dea38305ca7065b97482d22fcdeed8a8051c5"} err="failed to get container status \"73990548f38f06ad2795c4de799dea38305ca7065b97482d22fcdeed8a8051c5\": rpc error: code = NotFound desc = could not find container \"73990548f38f06ad2795c4de799dea38305ca7065b97482d22fcdeed8a8051c5\": container with ID starting 
with 73990548f38f06ad2795c4de799dea38305ca7065b97482d22fcdeed8a8051c5 not found: ID does not exist" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.389880 4764 scope.go:117] "RemoveContainer" containerID="4c95fd726c1b28b4f8ab8a6d5ebf1002f0c9dea4159ff74f568392de3a2ff067" Mar 09 13:41:44 crc kubenswrapper[4764]: E0309 13:41:44.390320 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c95fd726c1b28b4f8ab8a6d5ebf1002f0c9dea4159ff74f568392de3a2ff067\": container with ID starting with 4c95fd726c1b28b4f8ab8a6d5ebf1002f0c9dea4159ff74f568392de3a2ff067 not found: ID does not exist" containerID="4c95fd726c1b28b4f8ab8a6d5ebf1002f0c9dea4159ff74f568392de3a2ff067" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.390345 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c95fd726c1b28b4f8ab8a6d5ebf1002f0c9dea4159ff74f568392de3a2ff067"} err="failed to get container status \"4c95fd726c1b28b4f8ab8a6d5ebf1002f0c9dea4159ff74f568392de3a2ff067\": rpc error: code = NotFound desc = could not find container \"4c95fd726c1b28b4f8ab8a6d5ebf1002f0c9dea4159ff74f568392de3a2ff067\": container with ID starting with 4c95fd726c1b28b4f8ab8a6d5ebf1002f0c9dea4159ff74f568392de3a2ff067 not found: ID does not exist" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.398624 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.433077 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.512394 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.589657 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7dcdc6487b-j8w75"] Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.605536 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7dcdc6487b-j8w75"] Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.705485 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-combined-ca-bundle\") pod \"6afeb6a7-a0a0-40de-90ee-97b497663798\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.705758 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-scripts\") pod \"6afeb6a7-a0a0-40de-90ee-97b497663798\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.705799 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6afeb6a7-a0a0-40de-90ee-97b497663798-etc-machine-id\") pod \"6afeb6a7-a0a0-40de-90ee-97b497663798\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.705938 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rsb9\" (UniqueName: \"kubernetes.io/projected/6afeb6a7-a0a0-40de-90ee-97b497663798-kube-api-access-5rsb9\") pod \"6afeb6a7-a0a0-40de-90ee-97b497663798\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.706073 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-config-data-custom\") pod \"6afeb6a7-a0a0-40de-90ee-97b497663798\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.706098 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-config-data\") pod \"6afeb6a7-a0a0-40de-90ee-97b497663798\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.706338 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6afeb6a7-a0a0-40de-90ee-97b497663798-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6afeb6a7-a0a0-40de-90ee-97b497663798" (UID: "6afeb6a7-a0a0-40de-90ee-97b497663798"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.706736 4764 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6afeb6a7-a0a0-40de-90ee-97b497663798-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.712726 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-scripts" (OuterVolumeSpecName: "scripts") pod "6afeb6a7-a0a0-40de-90ee-97b497663798" (UID: "6afeb6a7-a0a0-40de-90ee-97b497663798"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.714559 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6afeb6a7-a0a0-40de-90ee-97b497663798-kube-api-access-5rsb9" (OuterVolumeSpecName: "kube-api-access-5rsb9") pod "6afeb6a7-a0a0-40de-90ee-97b497663798" (UID: "6afeb6a7-a0a0-40de-90ee-97b497663798"). InnerVolumeSpecName "kube-api-access-5rsb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.715823 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6afeb6a7-a0a0-40de-90ee-97b497663798" (UID: "6afeb6a7-a0a0-40de-90ee-97b497663798"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.771824 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6afeb6a7-a0a0-40de-90ee-97b497663798" (UID: "6afeb6a7-a0a0-40de-90ee-97b497663798"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.774479 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.812560 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rsb9\" (UniqueName: \"kubernetes.io/projected/6afeb6a7-a0a0-40de-90ee-97b497663798-kube-api-access-5rsb9\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.812598 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.812607 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.812616 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.833608 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-config-data" (OuterVolumeSpecName: "config-data") pod "6afeb6a7-a0a0-40de-90ee-97b497663798" (UID: "6afeb6a7-a0a0-40de-90ee-97b497663798"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.870298 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-586d68b4fd-xj4tk"] Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.870570 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-586d68b4fd-xj4tk" podUID="e89507a7-04a1-444b-b38a-40b001ec079a" containerName="placement-log" containerID="cri-o://846d1c4f6ed0185cda05e320aba446c1f9b2558f2a5da71ef147be109e27b726" gracePeriod=30 Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.870771 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-586d68b4fd-xj4tk" podUID="e89507a7-04a1-444b-b38a-40b001ec079a" containerName="placement-api" containerID="cri-o://6939747e3ae2dc764ad9ad287190d81cffed4e3b8fef6031a7e72a9769a88873" gracePeriod=30 Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.915097 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.271231 4764 generic.go:334] "Generic (PLEG): container finished" podID="e89507a7-04a1-444b-b38a-40b001ec079a" containerID="846d1c4f6ed0185cda05e320aba446c1f9b2558f2a5da71ef147be109e27b726" exitCode=143 Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.271314 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-586d68b4fd-xj4tk" event={"ID":"e89507a7-04a1-444b-b38a-40b001ec079a","Type":"ContainerDied","Data":"846d1c4f6ed0185cda05e320aba446c1f9b2558f2a5da71ef147be109e27b726"} Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.277609 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"6afeb6a7-a0a0-40de-90ee-97b497663798","Type":"ContainerDied","Data":"ca431b92c1671db841cf4cedf0d25ffe1db3212b3b5515413a5c717be63cab9f"} Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.277768 4764 scope.go:117] "RemoveContainer" containerID="dd358903bd4bede8bf029bbf11c70e3e47fe50f447bead2b369061bf66fe7262" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.277777 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.306666 4764 scope.go:117] "RemoveContainer" containerID="3048f7a722fe7b1924f522a8fa95862bc27ca27611f2c6a75277ad4a82b3a7d9" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.315449 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.344761 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.352552 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 13:41:45 crc kubenswrapper[4764]: E0309 13:41:45.353092 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d65fa53-be02-4d11-b300-5cb4629c03da" containerName="neutron-httpd" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.353117 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d65fa53-be02-4d11-b300-5cb4629c03da" containerName="neutron-httpd" Mar 09 13:41:45 crc kubenswrapper[4764]: E0309 13:41:45.353131 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b2b268a-adc9-46ca-908a-d30ab8543059" containerName="init" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.353138 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b2b268a-adc9-46ca-908a-d30ab8543059" containerName="init" Mar 09 13:41:45 crc kubenswrapper[4764]: E0309 13:41:45.353151 4764 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6afeb6a7-a0a0-40de-90ee-97b497663798" containerName="cinder-scheduler" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.353158 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6afeb6a7-a0a0-40de-90ee-97b497663798" containerName="cinder-scheduler" Mar 09 13:41:45 crc kubenswrapper[4764]: E0309 13:41:45.353173 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6afeb6a7-a0a0-40de-90ee-97b497663798" containerName="probe" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.353179 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6afeb6a7-a0a0-40de-90ee-97b497663798" containerName="probe" Mar 09 13:41:45 crc kubenswrapper[4764]: E0309 13:41:45.353189 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b2b268a-adc9-46ca-908a-d30ab8543059" containerName="dnsmasq-dns" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.353195 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b2b268a-adc9-46ca-908a-d30ab8543059" containerName="dnsmasq-dns" Mar 09 13:41:45 crc kubenswrapper[4764]: E0309 13:41:45.353205 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d65fa53-be02-4d11-b300-5cb4629c03da" containerName="neutron-api" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.353211 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d65fa53-be02-4d11-b300-5cb4629c03da" containerName="neutron-api" Mar 09 13:41:45 crc kubenswrapper[4764]: E0309 13:41:45.353220 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60a250ee-f349-4bab-b5b4-b402289210a6" containerName="barbican-api" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.353226 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="60a250ee-f349-4bab-b5b4-b402289210a6" containerName="barbican-api" Mar 09 13:41:45 crc kubenswrapper[4764]: E0309 13:41:45.353245 4764 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="60a250ee-f349-4bab-b5b4-b402289210a6" containerName="barbican-api-log" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.353252 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="60a250ee-f349-4bab-b5b4-b402289210a6" containerName="barbican-api-log" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.353450 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b2b268a-adc9-46ca-908a-d30ab8543059" containerName="dnsmasq-dns" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.353475 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d65fa53-be02-4d11-b300-5cb4629c03da" containerName="neutron-httpd" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.353486 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="60a250ee-f349-4bab-b5b4-b402289210a6" containerName="barbican-api" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.353510 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d65fa53-be02-4d11-b300-5cb4629c03da" containerName="neutron-api" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.353527 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="60a250ee-f349-4bab-b5b4-b402289210a6" containerName="barbican-api-log" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.353535 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6afeb6a7-a0a0-40de-90ee-97b497663798" containerName="cinder-scheduler" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.353543 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6afeb6a7-a0a0-40de-90ee-97b497663798" containerName="probe" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.354523 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.358771 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.413061 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.527568 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05d58314-31c8-4b6a-8c8c-1dc211d9f424-config-data\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.527634 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/05d58314-31c8-4b6a-8c8c-1dc211d9f424-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.527694 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05d58314-31c8-4b6a-8c8c-1dc211d9f424-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.527720 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05d58314-31c8-4b6a-8c8c-1dc211d9f424-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 
13:41:45.528124 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtrq2\" (UniqueName: \"kubernetes.io/projected/05d58314-31c8-4b6a-8c8c-1dc211d9f424-kube-api-access-qtrq2\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.528479 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05d58314-31c8-4b6a-8c8c-1dc211d9f424-scripts\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.579840 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60a250ee-f349-4bab-b5b4-b402289210a6" path="/var/lib/kubelet/pods/60a250ee-f349-4bab-b5b4-b402289210a6/volumes" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.580481 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6afeb6a7-a0a0-40de-90ee-97b497663798" path="/var/lib/kubelet/pods/6afeb6a7-a0a0-40de-90ee-97b497663798/volumes" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.631000 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05d58314-31c8-4b6a-8c8c-1dc211d9f424-config-data\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.631810 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/05d58314-31c8-4b6a-8c8c-1dc211d9f424-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: 
I0309 13:41:45.631870 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05d58314-31c8-4b6a-8c8c-1dc211d9f424-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.631901 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05d58314-31c8-4b6a-8c8c-1dc211d9f424-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.631958 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtrq2\" (UniqueName: \"kubernetes.io/projected/05d58314-31c8-4b6a-8c8c-1dc211d9f424-kube-api-access-qtrq2\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.632027 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/05d58314-31c8-4b6a-8c8c-1dc211d9f424-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.632040 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05d58314-31c8-4b6a-8c8c-1dc211d9f424-scripts\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.636153 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 09 13:41:45 crc 
kubenswrapper[4764]: I0309 13:41:45.638898 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05d58314-31c8-4b6a-8c8c-1dc211d9f424-scripts\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.639178 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05d58314-31c8-4b6a-8c8c-1dc211d9f424-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.639413 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05d58314-31c8-4b6a-8c8c-1dc211d9f424-config-data\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.650219 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05d58314-31c8-4b6a-8c8c-1dc211d9f424-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.650684 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtrq2\" (UniqueName: \"kubernetes.io/projected/05d58314-31c8-4b6a-8c8c-1dc211d9f424-kube-api-access-qtrq2\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.722101 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 13:41:46 crc kubenswrapper[4764]: I0309 13:41:46.323940 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 13:41:47 crc kubenswrapper[4764]: I0309 13:41:47.314699 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"05d58314-31c8-4b6a-8c8c-1dc211d9f424","Type":"ContainerStarted","Data":"280763f18ffca699716abeb32eb578440349294752b3cb7f97f829f5b450fa17"} Mar 09 13:41:47 crc kubenswrapper[4764]: I0309 13:41:47.315345 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"05d58314-31c8-4b6a-8c8c-1dc211d9f424","Type":"ContainerStarted","Data":"15ea221d2905458f73538d3f2a43c8a79b9c820ae64821655ca5ba44c22d5dbd"} Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.327005 4764 generic.go:334] "Generic (PLEG): container finished" podID="e89507a7-04a1-444b-b38a-40b001ec079a" containerID="6939747e3ae2dc764ad9ad287190d81cffed4e3b8fef6031a7e72a9769a88873" exitCode=0 Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.327469 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-586d68b4fd-xj4tk" event={"ID":"e89507a7-04a1-444b-b38a-40b001ec079a","Type":"ContainerDied","Data":"6939747e3ae2dc764ad9ad287190d81cffed4e3b8fef6031a7e72a9769a88873"} Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.334056 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"05d58314-31c8-4b6a-8c8c-1dc211d9f424","Type":"ContainerStarted","Data":"749a569e9b8139758a9781531307bf7e1c326dcee90021ec2c57d32ca51f6804"} Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.542281 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.579624 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.579592025 podStartE2EDuration="3.579592025s" podCreationTimestamp="2026-03-09 13:41:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:41:48.360923184 +0000 UTC m=+1263.611095092" watchObservedRunningTime="2026-03-09 13:41:48.579592025 +0000 UTC m=+1263.829763933" Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.629748 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-combined-ca-bundle\") pod \"e89507a7-04a1-444b-b38a-40b001ec079a\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.629831 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-scripts\") pod \"e89507a7-04a1-444b-b38a-40b001ec079a\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.629877 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-public-tls-certs\") pod \"e89507a7-04a1-444b-b38a-40b001ec079a\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.629950 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8mnp\" (UniqueName: \"kubernetes.io/projected/e89507a7-04a1-444b-b38a-40b001ec079a-kube-api-access-j8mnp\") pod \"e89507a7-04a1-444b-b38a-40b001ec079a\" 
(UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.630007 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e89507a7-04a1-444b-b38a-40b001ec079a-logs\") pod \"e89507a7-04a1-444b-b38a-40b001ec079a\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.630091 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-config-data\") pod \"e89507a7-04a1-444b-b38a-40b001ec079a\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.630118 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-internal-tls-certs\") pod \"e89507a7-04a1-444b-b38a-40b001ec079a\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.634175 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e89507a7-04a1-444b-b38a-40b001ec079a-logs" (OuterVolumeSpecName: "logs") pod "e89507a7-04a1-444b-b38a-40b001ec079a" (UID: "e89507a7-04a1-444b-b38a-40b001ec079a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.659278 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e89507a7-04a1-444b-b38a-40b001ec079a-kube-api-access-j8mnp" (OuterVolumeSpecName: "kube-api-access-j8mnp") pod "e89507a7-04a1-444b-b38a-40b001ec079a" (UID: "e89507a7-04a1-444b-b38a-40b001ec079a"). InnerVolumeSpecName "kube-api-access-j8mnp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.659501 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-scripts" (OuterVolumeSpecName: "scripts") pod "e89507a7-04a1-444b-b38a-40b001ec079a" (UID: "e89507a7-04a1-444b-b38a-40b001ec079a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.733170 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.733217 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8mnp\" (UniqueName: \"kubernetes.io/projected/e89507a7-04a1-444b-b38a-40b001ec079a-kube-api-access-j8mnp\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.733232 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e89507a7-04a1-444b-b38a-40b001ec079a-logs\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.756042 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e89507a7-04a1-444b-b38a-40b001ec079a" (UID: "e89507a7-04a1-444b-b38a-40b001ec079a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.763827 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-config-data" (OuterVolumeSpecName: "config-data") pod "e89507a7-04a1-444b-b38a-40b001ec079a" (UID: "e89507a7-04a1-444b-b38a-40b001ec079a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.799547 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e89507a7-04a1-444b-b38a-40b001ec079a" (UID: "e89507a7-04a1-444b-b38a-40b001ec079a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.803844 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e89507a7-04a1-444b-b38a-40b001ec079a" (UID: "e89507a7-04a1-444b-b38a-40b001ec079a"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.835212 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.835262 4764 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.835277 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.835289 4764 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.270376 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 09 13:41:49 crc kubenswrapper[4764]: E0309 13:41:49.271310 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e89507a7-04a1-444b-b38a-40b001ec079a" containerName="placement-log" Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.271332 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e89507a7-04a1-444b-b38a-40b001ec079a" containerName="placement-log" Mar 09 13:41:49 crc kubenswrapper[4764]: E0309 13:41:49.271347 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e89507a7-04a1-444b-b38a-40b001ec079a" containerName="placement-api" Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.271356 4764 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e89507a7-04a1-444b-b38a-40b001ec079a" containerName="placement-api" Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.271593 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e89507a7-04a1-444b-b38a-40b001ec079a" containerName="placement-api" Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.271630 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e89507a7-04a1-444b-b38a-40b001ec079a" containerName="placement-log" Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.272356 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.278771 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.278770 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.279154 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-p9c7c" Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.294451 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.356033 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d82ed357-9f4c-478b-b893-ab6ff10fc83c-openstack-config-secret\") pod \"openstackclient\" (UID: \"d82ed357-9f4c-478b-b893-ab6ff10fc83c\") " pod="openstack/openstackclient" Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.356098 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/d82ed357-9f4c-478b-b893-ab6ff10fc83c-openstack-config\") pod \"openstackclient\" (UID: \"d82ed357-9f4c-478b-b893-ab6ff10fc83c\") " pod="openstack/openstackclient" Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.356169 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d82ed357-9f4c-478b-b893-ab6ff10fc83c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d82ed357-9f4c-478b-b893-ab6ff10fc83c\") " pod="openstack/openstackclient" Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.356281 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v458d\" (UniqueName: \"kubernetes.io/projected/d82ed357-9f4c-478b-b893-ab6ff10fc83c-kube-api-access-v458d\") pod \"openstackclient\" (UID: \"d82ed357-9f4c-478b-b893-ab6ff10fc83c\") " pod="openstack/openstackclient" Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.384746 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.384739 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-586d68b4fd-xj4tk" event={"ID":"e89507a7-04a1-444b-b38a-40b001ec079a","Type":"ContainerDied","Data":"521edccf936d07531fd9f618513cede620e92cebf6edb91c3eb8e6d1d0940b40"} Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.384850 4764 scope.go:117] "RemoveContainer" containerID="6939747e3ae2dc764ad9ad287190d81cffed4e3b8fef6031a7e72a9769a88873" Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.457854 4764 scope.go:117] "RemoveContainer" containerID="846d1c4f6ed0185cda05e320aba446c1f9b2558f2a5da71ef147be109e27b726" Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.460328 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d82ed357-9f4c-478b-b893-ab6ff10fc83c-openstack-config-secret\") pod \"openstackclient\" (UID: \"d82ed357-9f4c-478b-b893-ab6ff10fc83c\") " pod="openstack/openstackclient" Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.460392 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d82ed357-9f4c-478b-b893-ab6ff10fc83c-openstack-config\") pod \"openstackclient\" (UID: \"d82ed357-9f4c-478b-b893-ab6ff10fc83c\") " pod="openstack/openstackclient" Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.460454 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d82ed357-9f4c-478b-b893-ab6ff10fc83c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d82ed357-9f4c-478b-b893-ab6ff10fc83c\") " pod="openstack/openstackclient" Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.460527 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-v458d\" (UniqueName: \"kubernetes.io/projected/d82ed357-9f4c-478b-b893-ab6ff10fc83c-kube-api-access-v458d\") pod \"openstackclient\" (UID: \"d82ed357-9f4c-478b-b893-ab6ff10fc83c\") " pod="openstack/openstackclient" Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.463548 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d82ed357-9f4c-478b-b893-ab6ff10fc83c-openstack-config\") pod \"openstackclient\" (UID: \"d82ed357-9f4c-478b-b893-ab6ff10fc83c\") " pod="openstack/openstackclient" Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.483699 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-586d68b4fd-xj4tk"] Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.485684 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d82ed357-9f4c-478b-b893-ab6ff10fc83c-openstack-config-secret\") pod \"openstackclient\" (UID: \"d82ed357-9f4c-478b-b893-ab6ff10fc83c\") " pod="openstack/openstackclient" Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.489303 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d82ed357-9f4c-478b-b893-ab6ff10fc83c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d82ed357-9f4c-478b-b893-ab6ff10fc83c\") " pod="openstack/openstackclient" Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.490699 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v458d\" (UniqueName: \"kubernetes.io/projected/d82ed357-9f4c-478b-b893-ab6ff10fc83c-kube-api-access-v458d\") pod \"openstackclient\" (UID: \"d82ed357-9f4c-478b-b893-ab6ff10fc83c\") " pod="openstack/openstackclient" Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.518580 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/placement-586d68b4fd-xj4tk"] Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.591899 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.604457 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e89507a7-04a1-444b-b38a-40b001ec079a" path="/var/lib/kubelet/pods/e89507a7-04a1-444b-b38a-40b001ec079a/volumes" Mar 09 13:41:50 crc kubenswrapper[4764]: I0309 13:41:50.121946 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 09 13:41:50 crc kubenswrapper[4764]: W0309 13:41:50.124296 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd82ed357_9f4c_478b_b893_ab6ff10fc83c.slice/crio-b37961ab13817fdc9aaa7d3640d627039046c5872124cc4eb3d499e3dd38bec3 WatchSource:0}: Error finding container b37961ab13817fdc9aaa7d3640d627039046c5872124cc4eb3d499e3dd38bec3: Status 404 returned error can't find the container with id b37961ab13817fdc9aaa7d3640d627039046c5872124cc4eb3d499e3dd38bec3 Mar 09 13:41:50 crc kubenswrapper[4764]: I0309 13:41:50.321328 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 09 13:41:50 crc kubenswrapper[4764]: I0309 13:41:50.398486 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d82ed357-9f4c-478b-b893-ab6ff10fc83c","Type":"ContainerStarted","Data":"b37961ab13817fdc9aaa7d3640d627039046c5872124cc4eb3d499e3dd38bec3"} Mar 09 13:41:50 crc kubenswrapper[4764]: I0309 13:41:50.722785 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 09 13:41:55 crc kubenswrapper[4764]: I0309 13:41:55.999041 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 09 13:41:57 crc 
kubenswrapper[4764]: I0309 13:41:57.319333 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 09 13:41:58 crc kubenswrapper[4764]: I0309 13:41:58.348795 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:41:58 crc kubenswrapper[4764]: I0309 13:41:58.349233 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerName="proxy-httpd" containerID="cri-o://8e00a9b64763a83494a83ac4067c14a67cca49b0a2f0a325e393f9400b39892b" gracePeriod=30 Mar 09 13:41:58 crc kubenswrapper[4764]: I0309 13:41:58.349306 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerName="ceilometer-notification-agent" containerID="cri-o://1e8bef222d882b21a7ec25aed9429545abfc2c7a862b52718f01f100ae12b17d" gracePeriod=30 Mar 09 13:41:58 crc kubenswrapper[4764]: I0309 13:41:58.349398 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerName="sg-core" containerID="cri-o://4fe2306e68853bfa3180ad2e5295480a2103723960a9860736e8ad880fc1db61" gracePeriod=30 Mar 09 13:41:58 crc kubenswrapper[4764]: I0309 13:41:58.349506 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerName="ceilometer-central-agent" containerID="cri-o://d86adcecb94ef745afb424352583b984c64f486d6d2dfabedad259db0634d772" gracePeriod=30 Mar 09 13:41:58 crc kubenswrapper[4764]: I0309 13:41:58.494114 4764 generic.go:334] "Generic (PLEG): container finished" podID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerID="4fe2306e68853bfa3180ad2e5295480a2103723960a9860736e8ad880fc1db61" exitCode=2 Mar 09 13:41:58 crc kubenswrapper[4764]: I0309 
13:41:58.494208 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3","Type":"ContainerDied","Data":"4fe2306e68853bfa3180ad2e5295480a2103723960a9860736e8ad880fc1db61"} Mar 09 13:41:59 crc kubenswrapper[4764]: I0309 13:41:59.529667 4764 generic.go:334] "Generic (PLEG): container finished" podID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerID="8e00a9b64763a83494a83ac4067c14a67cca49b0a2f0a325e393f9400b39892b" exitCode=0 Mar 09 13:41:59 crc kubenswrapper[4764]: I0309 13:41:59.529710 4764 generic.go:334] "Generic (PLEG): container finished" podID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerID="1e8bef222d882b21a7ec25aed9429545abfc2c7a862b52718f01f100ae12b17d" exitCode=0 Mar 09 13:41:59 crc kubenswrapper[4764]: I0309 13:41:59.529723 4764 generic.go:334] "Generic (PLEG): container finished" podID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerID="d86adcecb94ef745afb424352583b984c64f486d6d2dfabedad259db0634d772" exitCode=0 Mar 09 13:41:59 crc kubenswrapper[4764]: I0309 13:41:59.529750 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3","Type":"ContainerDied","Data":"8e00a9b64763a83494a83ac4067c14a67cca49b0a2f0a325e393f9400b39892b"} Mar 09 13:41:59 crc kubenswrapper[4764]: I0309 13:41:59.529788 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3","Type":"ContainerDied","Data":"1e8bef222d882b21a7ec25aed9429545abfc2c7a862b52718f01f100ae12b17d"} Mar 09 13:41:59 crc kubenswrapper[4764]: I0309 13:41:59.529805 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3","Type":"ContainerDied","Data":"d86adcecb94ef745afb424352583b984c64f486d6d2dfabedad259db0634d772"} Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.145717 4764 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551062-wl7gf"] Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.155011 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551062-wl7gf" Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.160299 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.160321 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.160563 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.164111 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551062-wl7gf"] Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.258147 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mcfc\" (UniqueName: \"kubernetes.io/projected/61f41fcc-bb74-4c90-a6af-bfcd168ef2cb-kube-api-access-7mcfc\") pod \"auto-csr-approver-29551062-wl7gf\" (UID: \"61f41fcc-bb74-4c90-a6af-bfcd168ef2cb\") " pod="openshift-infra/auto-csr-approver-29551062-wl7gf" Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.360600 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mcfc\" (UniqueName: \"kubernetes.io/projected/61f41fcc-bb74-4c90-a6af-bfcd168ef2cb-kube-api-access-7mcfc\") pod \"auto-csr-approver-29551062-wl7gf\" (UID: \"61f41fcc-bb74-4c90-a6af-bfcd168ef2cb\") " pod="openshift-infra/auto-csr-approver-29551062-wl7gf" Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.399042 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mcfc\" 
(UniqueName: \"kubernetes.io/projected/61f41fcc-bb74-4c90-a6af-bfcd168ef2cb-kube-api-access-7mcfc\") pod \"auto-csr-approver-29551062-wl7gf\" (UID: \"61f41fcc-bb74-4c90-a6af-bfcd168ef2cb\") " pod="openshift-infra/auto-csr-approver-29551062-wl7gf" Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.478311 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551062-wl7gf" Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.729782 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.769300 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-scripts\") pod \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.769362 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-config-data\") pod \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.769466 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpxx4\" (UniqueName: \"kubernetes.io/projected/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-kube-api-access-qpxx4\") pod \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.769512 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-run-httpd\") pod \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\" (UID: 
\"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.769551 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-log-httpd\") pod \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.769718 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-sg-core-conf-yaml\") pod \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.769742 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-combined-ca-bundle\") pod \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.772531 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" (UID: "df5caebe-ad05-48b0-bbce-ecb2ec29e7c3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.773668 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" (UID: "df5caebe-ad05-48b0-bbce-ecb2ec29e7c3"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.776664 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-kube-api-access-qpxx4" (OuterVolumeSpecName: "kube-api-access-qpxx4") pod "df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" (UID: "df5caebe-ad05-48b0-bbce-ecb2ec29e7c3"). InnerVolumeSpecName "kube-api-access-qpxx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.777061 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-scripts" (OuterVolumeSpecName: "scripts") pod "df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" (UID: "df5caebe-ad05-48b0-bbce-ecb2ec29e7c3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.802120 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" (UID: "df5caebe-ad05-48b0-bbce-ecb2ec29e7c3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.870719 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" (UID: "df5caebe-ad05-48b0-bbce-ecb2ec29e7c3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.871950 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.871976 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.871988 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.872000 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpxx4\" (UniqueName: \"kubernetes.io/projected/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-kube-api-access-qpxx4\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.872011 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.872019 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.910534 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-config-data" (OuterVolumeSpecName: "config-data") pod "df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" (UID: "df5caebe-ad05-48b0-bbce-ecb2ec29e7c3"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.973487 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.036168 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551062-wl7gf"] Mar 09 13:42:01 crc kubenswrapper[4764]: W0309 13:42:01.042992 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61f41fcc_bb74_4c90_a6af_bfcd168ef2cb.slice/crio-eb63554a3776e0f69aa0f087de4b9957a83350b60020c393086a1cd637f2cf8d WatchSource:0}: Error finding container eb63554a3776e0f69aa0f087de4b9957a83350b60020c393086a1cd637f2cf8d: Status 404 returned error can't find the container with id eb63554a3776e0f69aa0f087de4b9957a83350b60020c393086a1cd637f2cf8d Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.549810 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d82ed357-9f4c-478b-b893-ab6ff10fc83c","Type":"ContainerStarted","Data":"37dc3e9342a9a8a18b3bbe6377f05fd5914b7ef1ca66c80903cc720041e16747"} Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.551357 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551062-wl7gf" event={"ID":"61f41fcc-bb74-4c90-a6af-bfcd168ef2cb","Type":"ContainerStarted","Data":"eb63554a3776e0f69aa0f087de4b9957a83350b60020c393086a1cd637f2cf8d"} Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.556066 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3","Type":"ContainerDied","Data":"a9199722fbed2e4a6e2a95c99d206366b5d4467326eba28ca7c133400c787e70"} Mar 09 
13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.556127 4764 scope.go:117] "RemoveContainer" containerID="8e00a9b64763a83494a83ac4067c14a67cca49b0a2f0a325e393f9400b39892b" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.556333 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.593033 4764 scope.go:117] "RemoveContainer" containerID="4fe2306e68853bfa3180ad2e5295480a2103723960a9860736e8ad880fc1db61" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.593508 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.411367263 podStartE2EDuration="12.593477667s" podCreationTimestamp="2026-03-09 13:41:49 +0000 UTC" firstStartedPulling="2026-03-09 13:41:50.126950229 +0000 UTC m=+1265.377122137" lastFinishedPulling="2026-03-09 13:42:00.309060643 +0000 UTC m=+1275.559232541" observedRunningTime="2026-03-09 13:42:01.56807716 +0000 UTC m=+1276.818249078" watchObservedRunningTime="2026-03-09 13:42:01.593477667 +0000 UTC m=+1276.843649575" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.645056 4764 scope.go:117] "RemoveContainer" containerID="1e8bef222d882b21a7ec25aed9429545abfc2c7a862b52718f01f100ae12b17d" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.646293 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.661920 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.673202 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:42:01 crc kubenswrapper[4764]: E0309 13:42:01.673627 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerName="ceilometer-central-agent" Mar 09 13:42:01 crc 
kubenswrapper[4764]: I0309 13:42:01.673664 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerName="ceilometer-central-agent" Mar 09 13:42:01 crc kubenswrapper[4764]: E0309 13:42:01.673693 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerName="sg-core" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.673699 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerName="sg-core" Mar 09 13:42:01 crc kubenswrapper[4764]: E0309 13:42:01.673707 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerName="proxy-httpd" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.673713 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerName="proxy-httpd" Mar 09 13:42:01 crc kubenswrapper[4764]: E0309 13:42:01.673723 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerName="ceilometer-notification-agent" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.673729 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerName="ceilometer-notification-agent" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.673880 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerName="ceilometer-notification-agent" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.673897 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerName="ceilometer-central-agent" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.673909 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" 
containerName="proxy-httpd" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.673920 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerName="sg-core" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.675595 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.682029 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.688662 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.688692 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.703137 4764 scope.go:117] "RemoveContainer" containerID="d86adcecb94ef745afb424352583b984c64f486d6d2dfabedad259db0634d772" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.709827 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cafd43e-a12e-46ee-8108-8e33d10c47ee-log-httpd\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.709871 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-scripts\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.709900 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-config-data\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.709917 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cafd43e-a12e-46ee-8108-8e33d10c47ee-run-httpd\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.709951 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.709991 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.710052 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x677c\" (UniqueName: \"kubernetes.io/projected/4cafd43e-a12e-46ee-8108-8e33d10c47ee-kube-api-access-x677c\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.812554 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cafd43e-a12e-46ee-8108-8e33d10c47ee-log-httpd\") pod \"ceilometer-0\" (UID: 
\"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.812660 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-scripts\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.812722 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-config-data\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.812757 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cafd43e-a12e-46ee-8108-8e33d10c47ee-run-httpd\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.813292 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cafd43e-a12e-46ee-8108-8e33d10c47ee-log-httpd\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.813353 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cafd43e-a12e-46ee-8108-8e33d10c47ee-run-httpd\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.813413 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.813554 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.813638 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x677c\" (UniqueName: \"kubernetes.io/projected/4cafd43e-a12e-46ee-8108-8e33d10c47ee-kube-api-access-x677c\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.819777 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-scripts\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.824052 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.824993 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 
13:42:01.831416 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-config-data\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.838105 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x677c\" (UniqueName: \"kubernetes.io/projected/4cafd43e-a12e-46ee-8108-8e33d10c47ee-kube-api-access-x677c\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.933640 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.934724 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.967956 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.968230 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="ce660994-4427-4d54-b83c-9c9ec7f64a9d" containerName="kube-state-metrics" containerID="cri-o://9847def4972a082de5de5ce66af1aea1d5c2f1147d6cb0e73ffadb728f7879a5" gracePeriod=30 Mar 09 13:42:02 crc kubenswrapper[4764]: I0309 13:42:02.505271 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:42:02 crc kubenswrapper[4764]: W0309 13:42:02.511833 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cafd43e_a12e_46ee_8108_8e33d10c47ee.slice/crio-5953a3b710032064c75479dd87c9f1404c8431fcc42a83d5bac920c5b52ac282 WatchSource:0}: Error finding container 
5953a3b710032064c75479dd87c9f1404c8431fcc42a83d5bac920c5b52ac282: Status 404 returned error can't find the container with id 5953a3b710032064c75479dd87c9f1404c8431fcc42a83d5bac920c5b52ac282 Mar 09 13:42:02 crc kubenswrapper[4764]: I0309 13:42:02.572452 4764 generic.go:334] "Generic (PLEG): container finished" podID="ce660994-4427-4d54-b83c-9c9ec7f64a9d" containerID="9847def4972a082de5de5ce66af1aea1d5c2f1147d6cb0e73ffadb728f7879a5" exitCode=2 Mar 09 13:42:02 crc kubenswrapper[4764]: I0309 13:42:02.572550 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ce660994-4427-4d54-b83c-9c9ec7f64a9d","Type":"ContainerDied","Data":"9847def4972a082de5de5ce66af1aea1d5c2f1147d6cb0e73ffadb728f7879a5"} Mar 09 13:42:02 crc kubenswrapper[4764]: I0309 13:42:02.572603 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ce660994-4427-4d54-b83c-9c9ec7f64a9d","Type":"ContainerDied","Data":"b8a8570b31f5838a5c44b32bc16eef1004cf79cfc6a6ee8f31255abe6b100221"} Mar 09 13:42:02 crc kubenswrapper[4764]: I0309 13:42:02.572618 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8a8570b31f5838a5c44b32bc16eef1004cf79cfc6a6ee8f31255abe6b100221" Mar 09 13:42:02 crc kubenswrapper[4764]: I0309 13:42:02.576494 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551062-wl7gf" event={"ID":"61f41fcc-bb74-4c90-a6af-bfcd168ef2cb","Type":"ContainerStarted","Data":"9f52cd378552f4424230b7014d94477e188c2eb48ce843f7df141a1298245bd6"} Mar 09 13:42:02 crc kubenswrapper[4764]: I0309 13:42:02.582969 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cafd43e-a12e-46ee-8108-8e33d10c47ee","Type":"ContainerStarted","Data":"5953a3b710032064c75479dd87c9f1404c8431fcc42a83d5bac920c5b52ac282"} Mar 09 13:42:02 crc kubenswrapper[4764]: I0309 13:42:02.583511 4764 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 13:42:02 crc kubenswrapper[4764]: I0309 13:42:02.604065 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551062-wl7gf" podStartSLOduration=1.64226788 podStartE2EDuration="2.604040657s" podCreationTimestamp="2026-03-09 13:42:00 +0000 UTC" firstStartedPulling="2026-03-09 13:42:01.045187184 +0000 UTC m=+1276.295359092" lastFinishedPulling="2026-03-09 13:42:02.006959971 +0000 UTC m=+1277.257131869" observedRunningTime="2026-03-09 13:42:02.597181515 +0000 UTC m=+1277.847353423" watchObservedRunningTime="2026-03-09 13:42:02.604040657 +0000 UTC m=+1277.854212565" Mar 09 13:42:02 crc kubenswrapper[4764]: I0309 13:42:02.632224 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gphmn\" (UniqueName: \"kubernetes.io/projected/ce660994-4427-4d54-b83c-9c9ec7f64a9d-kube-api-access-gphmn\") pod \"ce660994-4427-4d54-b83c-9c9ec7f64a9d\" (UID: \"ce660994-4427-4d54-b83c-9c9ec7f64a9d\") " Mar 09 13:42:02 crc kubenswrapper[4764]: I0309 13:42:02.643909 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce660994-4427-4d54-b83c-9c9ec7f64a9d-kube-api-access-gphmn" (OuterVolumeSpecName: "kube-api-access-gphmn") pod "ce660994-4427-4d54-b83c-9c9ec7f64a9d" (UID: "ce660994-4427-4d54-b83c-9c9ec7f64a9d"). InnerVolumeSpecName "kube-api-access-gphmn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:02 crc kubenswrapper[4764]: I0309 13:42:02.735285 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gphmn\" (UniqueName: \"kubernetes.io/projected/ce660994-4427-4d54-b83c-9c9ec7f64a9d-kube-api-access-gphmn\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.573394 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" path="/var/lib/kubelet/pods/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3/volumes" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.608670 4764 generic.go:334] "Generic (PLEG): container finished" podID="61f41fcc-bb74-4c90-a6af-bfcd168ef2cb" containerID="9f52cd378552f4424230b7014d94477e188c2eb48ce843f7df141a1298245bd6" exitCode=0 Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.609000 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551062-wl7gf" event={"ID":"61f41fcc-bb74-4c90-a6af-bfcd168ef2cb","Type":"ContainerDied","Data":"9f52cd378552f4424230b7014d94477e188c2eb48ce843f7df141a1298245bd6"} Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.613472 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cafd43e-a12e-46ee-8108-8e33d10c47ee","Type":"ContainerStarted","Data":"8584bf911d27e4b234249d90cbfea6653767b26c85f2f4fa5eeca9b1ef3d93a2"} Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.613493 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.654474 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.668684 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.725897 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 13:42:03 crc kubenswrapper[4764]: E0309 13:42:03.727388 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce660994-4427-4d54-b83c-9c9ec7f64a9d" containerName="kube-state-metrics" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.727407 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce660994-4427-4d54-b83c-9c9ec7f64a9d" containerName="kube-state-metrics" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.727834 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce660994-4427-4d54-b83c-9c9ec7f64a9d" containerName="kube-state-metrics" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.728801 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.738377 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.738729 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.754524 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.764840 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/179736ec-4215-4ad8-9800-a186978a767f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"179736ec-4215-4ad8-9800-a186978a767f\") " pod="openstack/kube-state-metrics-0" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.765726 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/179736ec-4215-4ad8-9800-a186978a767f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"179736ec-4215-4ad8-9800-a186978a767f\") " pod="openstack/kube-state-metrics-0" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.766187 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgtww\" (UniqueName: \"kubernetes.io/projected/179736ec-4215-4ad8-9800-a186978a767f-kube-api-access-bgtww\") pod \"kube-state-metrics-0\" (UID: \"179736ec-4215-4ad8-9800-a186978a767f\") " pod="openstack/kube-state-metrics-0" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.766278 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/179736ec-4215-4ad8-9800-a186978a767f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"179736ec-4215-4ad8-9800-a186978a767f\") " pod="openstack/kube-state-metrics-0" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.868081 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/179736ec-4215-4ad8-9800-a186978a767f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"179736ec-4215-4ad8-9800-a186978a767f\") " pod="openstack/kube-state-metrics-0" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.868204 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/179736ec-4215-4ad8-9800-a186978a767f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"179736ec-4215-4ad8-9800-a186978a767f\") " pod="openstack/kube-state-metrics-0" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.868304 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/179736ec-4215-4ad8-9800-a186978a767f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"179736ec-4215-4ad8-9800-a186978a767f\") " pod="openstack/kube-state-metrics-0" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.868346 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgtww\" (UniqueName: \"kubernetes.io/projected/179736ec-4215-4ad8-9800-a186978a767f-kube-api-access-bgtww\") pod \"kube-state-metrics-0\" (UID: \"179736ec-4215-4ad8-9800-a186978a767f\") " pod="openstack/kube-state-metrics-0" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.873712 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/179736ec-4215-4ad8-9800-a186978a767f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"179736ec-4215-4ad8-9800-a186978a767f\") " pod="openstack/kube-state-metrics-0" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.874443 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/179736ec-4215-4ad8-9800-a186978a767f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"179736ec-4215-4ad8-9800-a186978a767f\") " pod="openstack/kube-state-metrics-0" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.874458 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/179736ec-4215-4ad8-9800-a186978a767f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"179736ec-4215-4ad8-9800-a186978a767f\") " pod="openstack/kube-state-metrics-0" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.888511 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgtww\" (UniqueName: \"kubernetes.io/projected/179736ec-4215-4ad8-9800-a186978a767f-kube-api-access-bgtww\") pod \"kube-state-metrics-0\" (UID: \"179736ec-4215-4ad8-9800-a186978a767f\") " pod="openstack/kube-state-metrics-0" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.974693 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 13:42:04 crc kubenswrapper[4764]: I0309 13:42:04.430290 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 13:42:04 crc kubenswrapper[4764]: W0309 13:42:04.438024 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod179736ec_4215_4ad8_9800_a186978a767f.slice/crio-bea433135fb4cb0d388ba0f7dce87d81fbded59498da059f958f5ba30c0c4844 WatchSource:0}: Error finding container bea433135fb4cb0d388ba0f7dce87d81fbded59498da059f958f5ba30c0c4844: Status 404 returned error can't find the container with id bea433135fb4cb0d388ba0f7dce87d81fbded59498da059f958f5ba30c0c4844 Mar 09 13:42:04 crc kubenswrapper[4764]: I0309 13:42:04.671912 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"179736ec-4215-4ad8-9800-a186978a767f","Type":"ContainerStarted","Data":"bea433135fb4cb0d388ba0f7dce87d81fbded59498da059f958f5ba30c0c4844"} Mar 09 13:42:04 crc kubenswrapper[4764]: I0309 13:42:04.694769 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cafd43e-a12e-46ee-8108-8e33d10c47ee","Type":"ContainerStarted","Data":"77c79186eac3b625857450d160ad883bb9d71ca93dc8e2e59f473a386200af76"} Mar 09 13:42:05 crc kubenswrapper[4764]: I0309 13:42:05.063206 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551062-wl7gf" Mar 09 13:42:05 crc kubenswrapper[4764]: I0309 13:42:05.101879 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mcfc\" (UniqueName: \"kubernetes.io/projected/61f41fcc-bb74-4c90-a6af-bfcd168ef2cb-kube-api-access-7mcfc\") pod \"61f41fcc-bb74-4c90-a6af-bfcd168ef2cb\" (UID: \"61f41fcc-bb74-4c90-a6af-bfcd168ef2cb\") " Mar 09 13:42:05 crc kubenswrapper[4764]: I0309 13:42:05.108923 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61f41fcc-bb74-4c90-a6af-bfcd168ef2cb-kube-api-access-7mcfc" (OuterVolumeSpecName: "kube-api-access-7mcfc") pod "61f41fcc-bb74-4c90-a6af-bfcd168ef2cb" (UID: "61f41fcc-bb74-4c90-a6af-bfcd168ef2cb"). InnerVolumeSpecName "kube-api-access-7mcfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:05 crc kubenswrapper[4764]: I0309 13:42:05.204401 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mcfc\" (UniqueName: \"kubernetes.io/projected/61f41fcc-bb74-4c90-a6af-bfcd168ef2cb-kube-api-access-7mcfc\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:05 crc kubenswrapper[4764]: I0309 13:42:05.580163 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce660994-4427-4d54-b83c-9c9ec7f64a9d" path="/var/lib/kubelet/pods/ce660994-4427-4d54-b83c-9c9ec7f64a9d/volumes" Mar 09 13:42:05 crc kubenswrapper[4764]: I0309 13:42:05.677178 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551056-chr2q"] Mar 09 13:42:05 crc kubenswrapper[4764]: I0309 13:42:05.685552 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551056-chr2q"] Mar 09 13:42:05 crc kubenswrapper[4764]: I0309 13:42:05.709076 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4cafd43e-a12e-46ee-8108-8e33d10c47ee","Type":"ContainerStarted","Data":"ef59562971287c90872fa0c6501fb93b73e7422eb42b03a1776c5d4727a706c9"} Mar 09 13:42:05 crc kubenswrapper[4764]: I0309 13:42:05.717930 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551062-wl7gf" Mar 09 13:42:05 crc kubenswrapper[4764]: I0309 13:42:05.717923 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551062-wl7gf" event={"ID":"61f41fcc-bb74-4c90-a6af-bfcd168ef2cb","Type":"ContainerDied","Data":"eb63554a3776e0f69aa0f087de4b9957a83350b60020c393086a1cd637f2cf8d"} Mar 09 13:42:05 crc kubenswrapper[4764]: I0309 13:42:05.718138 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb63554a3776e0f69aa0f087de4b9957a83350b60020c393086a1cd637f2cf8d" Mar 09 13:42:05 crc kubenswrapper[4764]: I0309 13:42:05.726463 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"179736ec-4215-4ad8-9800-a186978a767f","Type":"ContainerStarted","Data":"081d3f8cf76e67f42fde0999650f46ab737b778f55da60e9d9086f5add2b2601"} Mar 09 13:42:05 crc kubenswrapper[4764]: I0309 13:42:05.726725 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 09 13:42:05 crc kubenswrapper[4764]: I0309 13:42:05.756596 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.346859035 podStartE2EDuration="2.756563224s" podCreationTimestamp="2026-03-09 13:42:03 +0000 UTC" firstStartedPulling="2026-03-09 13:42:04.442127688 +0000 UTC m=+1279.692299596" lastFinishedPulling="2026-03-09 13:42:04.851831877 +0000 UTC m=+1280.102003785" observedRunningTime="2026-03-09 13:42:05.746385429 +0000 UTC m=+1280.996557357" watchObservedRunningTime="2026-03-09 13:42:05.756563224 +0000 UTC m=+1281.006735172" Mar 09 
13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.085600 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-tchwl"] Mar 09 13:42:06 crc kubenswrapper[4764]: E0309 13:42:06.086420 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61f41fcc-bb74-4c90-a6af-bfcd168ef2cb" containerName="oc" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.086496 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f41fcc-bb74-4c90-a6af-bfcd168ef2cb" containerName="oc" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.086771 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="61f41fcc-bb74-4c90-a6af-bfcd168ef2cb" containerName="oc" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.087606 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tchwl" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.110662 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-tchwl"] Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.127584 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwkjb\" (UniqueName: \"kubernetes.io/projected/66003ca3-e579-4dab-b714-b5b2baa26bad-kube-api-access-dwkjb\") pod \"nova-api-db-create-tchwl\" (UID: \"66003ca3-e579-4dab-b714-b5b2baa26bad\") " pod="openstack/nova-api-db-create-tchwl" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.127681 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66003ca3-e579-4dab-b714-b5b2baa26bad-operator-scripts\") pod \"nova-api-db-create-tchwl\" (UID: \"66003ca3-e579-4dab-b714-b5b2baa26bad\") " pod="openstack/nova-api-db-create-tchwl" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.147809 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.180341 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-mnqg7"] Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.181747 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mnqg7" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.229244 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwkjb\" (UniqueName: \"kubernetes.io/projected/66003ca3-e579-4dab-b714-b5b2baa26bad-kube-api-access-dwkjb\") pod \"nova-api-db-create-tchwl\" (UID: \"66003ca3-e579-4dab-b714-b5b2baa26bad\") " pod="openstack/nova-api-db-create-tchwl" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.229352 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66003ca3-e579-4dab-b714-b5b2baa26bad-operator-scripts\") pod \"nova-api-db-create-tchwl\" (UID: \"66003ca3-e579-4dab-b714-b5b2baa26bad\") " pod="openstack/nova-api-db-create-tchwl" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.229559 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a75ea85a-1e66-4e8d-92d7-6f9b766abfda-operator-scripts\") pod \"nova-cell0-db-create-mnqg7\" (UID: \"a75ea85a-1e66-4e8d-92d7-6f9b766abfda\") " pod="openstack/nova-cell0-db-create-mnqg7" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.229609 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2ffg\" (UniqueName: \"kubernetes.io/projected/a75ea85a-1e66-4e8d-92d7-6f9b766abfda-kube-api-access-t2ffg\") pod \"nova-cell0-db-create-mnqg7\" (UID: \"a75ea85a-1e66-4e8d-92d7-6f9b766abfda\") " 
pod="openstack/nova-cell0-db-create-mnqg7" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.232450 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66003ca3-e579-4dab-b714-b5b2baa26bad-operator-scripts\") pod \"nova-api-db-create-tchwl\" (UID: \"66003ca3-e579-4dab-b714-b5b2baa26bad\") " pod="openstack/nova-api-db-create-tchwl" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.261048 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-mnqg7"] Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.286429 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwkjb\" (UniqueName: \"kubernetes.io/projected/66003ca3-e579-4dab-b714-b5b2baa26bad-kube-api-access-dwkjb\") pod \"nova-api-db-create-tchwl\" (UID: \"66003ca3-e579-4dab-b714-b5b2baa26bad\") " pod="openstack/nova-api-db-create-tchwl" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.341433 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a75ea85a-1e66-4e8d-92d7-6f9b766abfda-operator-scripts\") pod \"nova-cell0-db-create-mnqg7\" (UID: \"a75ea85a-1e66-4e8d-92d7-6f9b766abfda\") " pod="openstack/nova-cell0-db-create-mnqg7" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.341954 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2ffg\" (UniqueName: \"kubernetes.io/projected/a75ea85a-1e66-4e8d-92d7-6f9b766abfda-kube-api-access-t2ffg\") pod \"nova-cell0-db-create-mnqg7\" (UID: \"a75ea85a-1e66-4e8d-92d7-6f9b766abfda\") " pod="openstack/nova-cell0-db-create-mnqg7" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.343323 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/a75ea85a-1e66-4e8d-92d7-6f9b766abfda-operator-scripts\") pod \"nova-cell0-db-create-mnqg7\" (UID: \"a75ea85a-1e66-4e8d-92d7-6f9b766abfda\") " pod="openstack/nova-cell0-db-create-mnqg7" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.367901 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2ffg\" (UniqueName: \"kubernetes.io/projected/a75ea85a-1e66-4e8d-92d7-6f9b766abfda-kube-api-access-t2ffg\") pod \"nova-cell0-db-create-mnqg7\" (UID: \"a75ea85a-1e66-4e8d-92d7-6f9b766abfda\") " pod="openstack/nova-cell0-db-create-mnqg7" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.381516 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6b74bc6bc6-vsxl5"] Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.382282 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6b74bc6bc6-vsxl5" podUID="50610296-d076-4c9f-ac34-a976202ce135" containerName="neutron-httpd" containerID="cri-o://40fc71094cc1ee70bba232001951743f253a32a95993837575a49f9b6e385df9" gracePeriod=30 Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.382519 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6b74bc6bc6-vsxl5" podUID="50610296-d076-4c9f-ac34-a976202ce135" containerName="neutron-api" containerID="cri-o://508370eb92a6887d3b1ecf26300791f931d8f328026a6caf81b9a1ce10c18bed" gracePeriod=30 Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.406539 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tchwl" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.407353 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-78eb-account-create-update-2dqgt"] Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.409241 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-78eb-account-create-update-2dqgt" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.417856 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-tbf9j"] Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.418738 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.420313 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tbf9j" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.439074 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-78eb-account-create-update-2dqgt"] Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.468149 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-tbf9j"] Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.503971 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mnqg7" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.517046 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cc98-account-create-update-sjfqm"] Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.522073 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cc98-account-create-update-sjfqm" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.524890 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.525974 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cc98-account-create-update-sjfqm"] Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.550110 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d97l\" (UniqueName: \"kubernetes.io/projected/8fa35355-06e1-403f-9691-92398769ac09-kube-api-access-8d97l\") pod \"nova-cell1-db-create-tbf9j\" (UID: \"8fa35355-06e1-403f-9691-92398769ac09\") " pod="openstack/nova-cell1-db-create-tbf9j" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.550187 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r55f9\" (UniqueName: \"kubernetes.io/projected/b5daba6a-a01a-4400-aa87-01f9efd3abd8-kube-api-access-r55f9\") pod \"nova-api-78eb-account-create-update-2dqgt\" (UID: \"b5daba6a-a01a-4400-aa87-01f9efd3abd8\") " pod="openstack/nova-api-78eb-account-create-update-2dqgt" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.550262 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5daba6a-a01a-4400-aa87-01f9efd3abd8-operator-scripts\") pod \"nova-api-78eb-account-create-update-2dqgt\" (UID: \"b5daba6a-a01a-4400-aa87-01f9efd3abd8\") " pod="openstack/nova-api-78eb-account-create-update-2dqgt" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.550298 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/8fa35355-06e1-403f-9691-92398769ac09-operator-scripts\") pod \"nova-cell1-db-create-tbf9j\" (UID: \"8fa35355-06e1-403f-9691-92398769ac09\") " pod="openstack/nova-cell1-db-create-tbf9j" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.654146 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d97l\" (UniqueName: \"kubernetes.io/projected/8fa35355-06e1-403f-9691-92398769ac09-kube-api-access-8d97l\") pod \"nova-cell1-db-create-tbf9j\" (UID: \"8fa35355-06e1-403f-9691-92398769ac09\") " pod="openstack/nova-cell1-db-create-tbf9j" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.654215 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r55f9\" (UniqueName: \"kubernetes.io/projected/b5daba6a-a01a-4400-aa87-01f9efd3abd8-kube-api-access-r55f9\") pod \"nova-api-78eb-account-create-update-2dqgt\" (UID: \"b5daba6a-a01a-4400-aa87-01f9efd3abd8\") " pod="openstack/nova-api-78eb-account-create-update-2dqgt" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.654252 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5daba6a-a01a-4400-aa87-01f9efd3abd8-operator-scripts\") pod \"nova-api-78eb-account-create-update-2dqgt\" (UID: \"b5daba6a-a01a-4400-aa87-01f9efd3abd8\") " pod="openstack/nova-api-78eb-account-create-update-2dqgt" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.654294 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fa35355-06e1-403f-9691-92398769ac09-operator-scripts\") pod \"nova-cell1-db-create-tbf9j\" (UID: \"8fa35355-06e1-403f-9691-92398769ac09\") " pod="openstack/nova-cell1-db-create-tbf9j" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.654373 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-jwv2d\" (UniqueName: \"kubernetes.io/projected/5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c-kube-api-access-jwv2d\") pod \"nova-cell0-cc98-account-create-update-sjfqm\" (UID: \"5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c\") " pod="openstack/nova-cell0-cc98-account-create-update-sjfqm" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.654416 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c-operator-scripts\") pod \"nova-cell0-cc98-account-create-update-sjfqm\" (UID: \"5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c\") " pod="openstack/nova-cell0-cc98-account-create-update-sjfqm" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.656965 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fa35355-06e1-403f-9691-92398769ac09-operator-scripts\") pod \"nova-cell1-db-create-tbf9j\" (UID: \"8fa35355-06e1-403f-9691-92398769ac09\") " pod="openstack/nova-cell1-db-create-tbf9j" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.657637 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5daba6a-a01a-4400-aa87-01f9efd3abd8-operator-scripts\") pod \"nova-api-78eb-account-create-update-2dqgt\" (UID: \"b5daba6a-a01a-4400-aa87-01f9efd3abd8\") " pod="openstack/nova-api-78eb-account-create-update-2dqgt" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.689960 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r55f9\" (UniqueName: \"kubernetes.io/projected/b5daba6a-a01a-4400-aa87-01f9efd3abd8-kube-api-access-r55f9\") pod \"nova-api-78eb-account-create-update-2dqgt\" (UID: \"b5daba6a-a01a-4400-aa87-01f9efd3abd8\") " pod="openstack/nova-api-78eb-account-create-update-2dqgt" Mar 09 13:42:06 crc kubenswrapper[4764]: 
I0309 13:42:06.693199 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d97l\" (UniqueName: \"kubernetes.io/projected/8fa35355-06e1-403f-9691-92398769ac09-kube-api-access-8d97l\") pod \"nova-cell1-db-create-tbf9j\" (UID: \"8fa35355-06e1-403f-9691-92398769ac09\") " pod="openstack/nova-cell1-db-create-tbf9j" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.703081 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-f7d8-account-create-update-rp748"] Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.711799 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f7d8-account-create-update-rp748"] Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.711987 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f7d8-account-create-update-rp748" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.716037 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.742495 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-78eb-account-create-update-2dqgt" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.756173 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3db8af07-1310-4cd5-be07-3fd062fe89a7-operator-scripts\") pod \"nova-cell1-f7d8-account-create-update-rp748\" (UID: \"3db8af07-1310-4cd5-be07-3fd062fe89a7\") " pod="openstack/nova-cell1-f7d8-account-create-update-rp748" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.756254 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jk6c\" (UniqueName: \"kubernetes.io/projected/3db8af07-1310-4cd5-be07-3fd062fe89a7-kube-api-access-7jk6c\") pod \"nova-cell1-f7d8-account-create-update-rp748\" (UID: \"3db8af07-1310-4cd5-be07-3fd062fe89a7\") " pod="openstack/nova-cell1-f7d8-account-create-update-rp748" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.756311 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwv2d\" (UniqueName: \"kubernetes.io/projected/5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c-kube-api-access-jwv2d\") pod \"nova-cell0-cc98-account-create-update-sjfqm\" (UID: \"5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c\") " pod="openstack/nova-cell0-cc98-account-create-update-sjfqm" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.756340 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c-operator-scripts\") pod \"nova-cell0-cc98-account-create-update-sjfqm\" (UID: \"5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c\") " pod="openstack/nova-cell0-cc98-account-create-update-sjfqm" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.757146 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c-operator-scripts\") pod \"nova-cell0-cc98-account-create-update-sjfqm\" (UID: \"5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c\") " pod="openstack/nova-cell0-cc98-account-create-update-sjfqm" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.771457 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tbf9j" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.780470 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwv2d\" (UniqueName: \"kubernetes.io/projected/5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c-kube-api-access-jwv2d\") pod \"nova-cell0-cc98-account-create-update-sjfqm\" (UID: \"5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c\") " pod="openstack/nova-cell0-cc98-account-create-update-sjfqm" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.861844 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3db8af07-1310-4cd5-be07-3fd062fe89a7-operator-scripts\") pod \"nova-cell1-f7d8-account-create-update-rp748\" (UID: \"3db8af07-1310-4cd5-be07-3fd062fe89a7\") " pod="openstack/nova-cell1-f7d8-account-create-update-rp748" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.861945 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jk6c\" (UniqueName: \"kubernetes.io/projected/3db8af07-1310-4cd5-be07-3fd062fe89a7-kube-api-access-7jk6c\") pod \"nova-cell1-f7d8-account-create-update-rp748\" (UID: \"3db8af07-1310-4cd5-be07-3fd062fe89a7\") " pod="openstack/nova-cell1-f7d8-account-create-update-rp748" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.865728 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3db8af07-1310-4cd5-be07-3fd062fe89a7-operator-scripts\") pod 
\"nova-cell1-f7d8-account-create-update-rp748\" (UID: \"3db8af07-1310-4cd5-be07-3fd062fe89a7\") " pod="openstack/nova-cell1-f7d8-account-create-update-rp748" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.887815 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jk6c\" (UniqueName: \"kubernetes.io/projected/3db8af07-1310-4cd5-be07-3fd062fe89a7-kube-api-access-7jk6c\") pod \"nova-cell1-f7d8-account-create-update-rp748\" (UID: \"3db8af07-1310-4cd5-be07-3fd062fe89a7\") " pod="openstack/nova-cell1-f7d8-account-create-update-rp748" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.899435 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cc98-account-create-update-sjfqm" Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.043235 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f7d8-account-create-update-rp748" Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.098043 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-tchwl"] Mar 09 13:42:07 crc kubenswrapper[4764]: W0309 13:42:07.106090 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66003ca3_e579_4dab_b714_b5b2baa26bad.slice/crio-1b4ee63c3be90245829c57ff18357c6790f696fc4922e58488a0b15d231d7670 WatchSource:0}: Error finding container 1b4ee63c3be90245829c57ff18357c6790f696fc4922e58488a0b15d231d7670: Status 404 returned error can't find the container with id 1b4ee63c3be90245829c57ff18357c6790f696fc4922e58488a0b15d231d7670 Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.183576 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-mnqg7"] Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.357285 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell1-db-create-tbf9j"] Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.389479 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-78eb-account-create-update-2dqgt"] Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.578791 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5ceebdd-e9ad-472a-8806-f5b441ced89a" path="/var/lib/kubelet/pods/b5ceebdd-e9ad-472a-8806-f5b441ced89a/volumes" Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.667949 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cc98-account-create-update-sjfqm"] Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.754470 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f7d8-account-create-update-rp748"] Mar 09 13:42:07 crc kubenswrapper[4764]: W0309 13:42:07.767782 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3db8af07_1310_4cd5_be07_3fd062fe89a7.slice/crio-460f2f7700725f04372e42b1bbec425c0f8c5db8d2dfaf33891c7041869dfce9 WatchSource:0}: Error finding container 460f2f7700725f04372e42b1bbec425c0f8c5db8d2dfaf33891c7041869dfce9: Status 404 returned error can't find the container with id 460f2f7700725f04372e42b1bbec425c0f8c5db8d2dfaf33891c7041869dfce9 Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.768757 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-78eb-account-create-update-2dqgt" event={"ID":"b5daba6a-a01a-4400-aa87-01f9efd3abd8","Type":"ContainerStarted","Data":"68e1ea9724cff446a48722f5d3beb35f95107e4f92a2482a35467b3ee65b6d69"} Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.768824 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-78eb-account-create-update-2dqgt" 
event={"ID":"b5daba6a-a01a-4400-aa87-01f9efd3abd8","Type":"ContainerStarted","Data":"8b9629cb75484588167c111d3c7dbfcae1e4f0827b9d156b8c69932b90498b31"} Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.771941 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cc98-account-create-update-sjfqm" event={"ID":"5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c","Type":"ContainerStarted","Data":"d57d5ab16c67bc105085ecfbab0c639605f83f267c2a597f6e63924c995e5ba1"} Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.792516 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tbf9j" event={"ID":"8fa35355-06e1-403f-9691-92398769ac09","Type":"ContainerStarted","Data":"ea1b24536350a8debd8d3a73db534419bac8b767469b0a62084cf6812d4fbd33"} Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.792578 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tbf9j" event={"ID":"8fa35355-06e1-403f-9691-92398769ac09","Type":"ContainerStarted","Data":"7802efe10177c9d393bd5a30dc7dd873fbe98ff3606d74449c6a46a5501b117c"} Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.801616 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-78eb-account-create-update-2dqgt" podStartSLOduration=1.801591232 podStartE2EDuration="1.801591232s" podCreationTimestamp="2026-03-09 13:42:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:42:07.793339565 +0000 UTC m=+1283.043511483" watchObservedRunningTime="2026-03-09 13:42:07.801591232 +0000 UTC m=+1283.051763140" Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.809198 4764 generic.go:334] "Generic (PLEG): container finished" podID="50610296-d076-4c9f-ac34-a976202ce135" containerID="40fc71094cc1ee70bba232001951743f253a32a95993837575a49f9b6e385df9" exitCode=0 Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 
13:42:07.809352 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b74bc6bc6-vsxl5" event={"ID":"50610296-d076-4c9f-ac34-a976202ce135","Type":"ContainerDied","Data":"40fc71094cc1ee70bba232001951743f253a32a95993837575a49f9b6e385df9"} Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.830135 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tchwl" event={"ID":"66003ca3-e579-4dab-b714-b5b2baa26bad","Type":"ContainerStarted","Data":"5ac07a8dfc2c0d6d538f5fde06f5c7806d78d08f4964c253a85bccef288caf3e"} Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.830203 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tchwl" event={"ID":"66003ca3-e579-4dab-b714-b5b2baa26bad","Type":"ContainerStarted","Data":"1b4ee63c3be90245829c57ff18357c6790f696fc4922e58488a0b15d231d7670"} Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.838258 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-tbf9j" podStartSLOduration=1.838234111 podStartE2EDuration="1.838234111s" podCreationTimestamp="2026-03-09 13:42:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:42:07.832461046 +0000 UTC m=+1283.082632954" watchObservedRunningTime="2026-03-09 13:42:07.838234111 +0000 UTC m=+1283.088406039" Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.845267 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cafd43e-a12e-46ee-8108-8e33d10c47ee","Type":"ContainerStarted","Data":"5e72510b160d74d50bc0d91ad305ff42b401476dfb0ab2426d6d04bc04d5d679"} Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.845624 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" 
containerName="ceilometer-central-agent" containerID="cri-o://8584bf911d27e4b234249d90cbfea6653767b26c85f2f4fa5eeca9b1ef3d93a2" gracePeriod=30 Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.845701 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.845799 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerName="proxy-httpd" containerID="cri-o://5e72510b160d74d50bc0d91ad305ff42b401476dfb0ab2426d6d04bc04d5d679" gracePeriod=30 Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.845874 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerName="sg-core" containerID="cri-o://ef59562971287c90872fa0c6501fb93b73e7422eb42b03a1776c5d4727a706c9" gracePeriod=30 Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.845933 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerName="ceilometer-notification-agent" containerID="cri-o://77c79186eac3b625857450d160ad883bb9d71ca93dc8e2e59f473a386200af76" gracePeriod=30 Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.853811 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mnqg7" event={"ID":"a75ea85a-1e66-4e8d-92d7-6f9b766abfda","Type":"ContainerStarted","Data":"17c99031bb7ad39e3a6bf2e953eeaec9098edceedb7c2ac4d40bb3b9857101a1"} Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.853875 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mnqg7" event={"ID":"a75ea85a-1e66-4e8d-92d7-6f9b766abfda","Type":"ContainerStarted","Data":"ab77b0a3ad0dcdcd518820673440ab1d67aaea30117cd3647b19e8df2313926c"} Mar 09 13:42:07 crc 
kubenswrapper[4764]: I0309 13:42:07.944809 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.59244328 podStartE2EDuration="6.944779831s" podCreationTimestamp="2026-03-09 13:42:01 +0000 UTC" firstStartedPulling="2026-03-09 13:42:02.516159824 +0000 UTC m=+1277.766331732" lastFinishedPulling="2026-03-09 13:42:06.868496375 +0000 UTC m=+1282.118668283" observedRunningTime="2026-03-09 13:42:07.887892045 +0000 UTC m=+1283.138063953" watchObservedRunningTime="2026-03-09 13:42:07.944779831 +0000 UTC m=+1283.194951749" Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.957939 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-mnqg7" podStartSLOduration=1.95788655 podStartE2EDuration="1.95788655s" podCreationTimestamp="2026-03-09 13:42:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:42:07.91040232 +0000 UTC m=+1283.160574228" watchObservedRunningTime="2026-03-09 13:42:07.95788655 +0000 UTC m=+1283.208058468" Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.878795 4764 generic.go:334] "Generic (PLEG): container finished" podID="b5daba6a-a01a-4400-aa87-01f9efd3abd8" containerID="68e1ea9724cff446a48722f5d3beb35f95107e4f92a2482a35467b3ee65b6d69" exitCode=0 Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.879246 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-78eb-account-create-update-2dqgt" event={"ID":"b5daba6a-a01a-4400-aa87-01f9efd3abd8","Type":"ContainerDied","Data":"68e1ea9724cff446a48722f5d3beb35f95107e4f92a2482a35467b3ee65b6d69"} Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.891239 4764 generic.go:334] "Generic (PLEG): container finished" podID="5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c" containerID="2c7f017bce7c92c14d6be609c4899f3333d712b1c6dc036ede4a982ad2477f70" exitCode=0 Mar 09 
13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.891362 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cc98-account-create-update-sjfqm" event={"ID":"5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c","Type":"ContainerDied","Data":"2c7f017bce7c92c14d6be609c4899f3333d712b1c6dc036ede4a982ad2477f70"} Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.903482 4764 generic.go:334] "Generic (PLEG): container finished" podID="3db8af07-1310-4cd5-be07-3fd062fe89a7" containerID="d053ca844a6ffecdf233cee0796c2188368f4f6b2a5097474b6a14b438549f4b" exitCode=0 Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.903618 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f7d8-account-create-update-rp748" event={"ID":"3db8af07-1310-4cd5-be07-3fd062fe89a7","Type":"ContainerDied","Data":"d053ca844a6ffecdf233cee0796c2188368f4f6b2a5097474b6a14b438549f4b"} Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.903691 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f7d8-account-create-update-rp748" event={"ID":"3db8af07-1310-4cd5-be07-3fd062fe89a7","Type":"ContainerStarted","Data":"460f2f7700725f04372e42b1bbec425c0f8c5db8d2dfaf33891c7041869dfce9"} Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.915454 4764 generic.go:334] "Generic (PLEG): container finished" podID="8fa35355-06e1-403f-9691-92398769ac09" containerID="ea1b24536350a8debd8d3a73db534419bac8b767469b0a62084cf6812d4fbd33" exitCode=0 Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.915534 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tbf9j" event={"ID":"8fa35355-06e1-403f-9691-92398769ac09","Type":"ContainerDied","Data":"ea1b24536350a8debd8d3a73db534419bac8b767469b0a62084cf6812d4fbd33"} Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.922458 4764 generic.go:334] "Generic (PLEG): container finished" podID="66003ca3-e579-4dab-b714-b5b2baa26bad" 
containerID="5ac07a8dfc2c0d6d538f5fde06f5c7806d78d08f4964c253a85bccef288caf3e" exitCode=0 Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.922656 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tchwl" event={"ID":"66003ca3-e579-4dab-b714-b5b2baa26bad","Type":"ContainerDied","Data":"5ac07a8dfc2c0d6d538f5fde06f5c7806d78d08f4964c253a85bccef288caf3e"} Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.948608 4764 generic.go:334] "Generic (PLEG): container finished" podID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerID="5e72510b160d74d50bc0d91ad305ff42b401476dfb0ab2426d6d04bc04d5d679" exitCode=0 Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.948663 4764 generic.go:334] "Generic (PLEG): container finished" podID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerID="ef59562971287c90872fa0c6501fb93b73e7422eb42b03a1776c5d4727a706c9" exitCode=2 Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.948673 4764 generic.go:334] "Generic (PLEG): container finished" podID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerID="77c79186eac3b625857450d160ad883bb9d71ca93dc8e2e59f473a386200af76" exitCode=0 Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.948749 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cafd43e-a12e-46ee-8108-8e33d10c47ee","Type":"ContainerDied","Data":"5e72510b160d74d50bc0d91ad305ff42b401476dfb0ab2426d6d04bc04d5d679"} Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.948783 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cafd43e-a12e-46ee-8108-8e33d10c47ee","Type":"ContainerDied","Data":"ef59562971287c90872fa0c6501fb93b73e7422eb42b03a1776c5d4727a706c9"} Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.948798 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4cafd43e-a12e-46ee-8108-8e33d10c47ee","Type":"ContainerDied","Data":"77c79186eac3b625857450d160ad883bb9d71ca93dc8e2e59f473a386200af76"} Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.969457 4764 generic.go:334] "Generic (PLEG): container finished" podID="a75ea85a-1e66-4e8d-92d7-6f9b766abfda" containerID="17c99031bb7ad39e3a6bf2e953eeaec9098edceedb7c2ac4d40bb3b9857101a1" exitCode=0 Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.969600 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mnqg7" event={"ID":"a75ea85a-1e66-4e8d-92d7-6f9b766abfda","Type":"ContainerDied","Data":"17c99031bb7ad39e3a6bf2e953eeaec9098edceedb7c2ac4d40bb3b9857101a1"} Mar 09 13:42:09 crc kubenswrapper[4764]: I0309 13:42:09.328141 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tchwl" Mar 09 13:42:09 crc kubenswrapper[4764]: I0309 13:42:09.434839 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwkjb\" (UniqueName: \"kubernetes.io/projected/66003ca3-e579-4dab-b714-b5b2baa26bad-kube-api-access-dwkjb\") pod \"66003ca3-e579-4dab-b714-b5b2baa26bad\" (UID: \"66003ca3-e579-4dab-b714-b5b2baa26bad\") " Mar 09 13:42:09 crc kubenswrapper[4764]: I0309 13:42:09.434990 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66003ca3-e579-4dab-b714-b5b2baa26bad-operator-scripts\") pod \"66003ca3-e579-4dab-b714-b5b2baa26bad\" (UID: \"66003ca3-e579-4dab-b714-b5b2baa26bad\") " Mar 09 13:42:09 crc kubenswrapper[4764]: I0309 13:42:09.435438 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66003ca3-e579-4dab-b714-b5b2baa26bad-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "66003ca3-e579-4dab-b714-b5b2baa26bad" (UID: "66003ca3-e579-4dab-b714-b5b2baa26bad"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:42:09 crc kubenswrapper[4764]: I0309 13:42:09.441558 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66003ca3-e579-4dab-b714-b5b2baa26bad-kube-api-access-dwkjb" (OuterVolumeSpecName: "kube-api-access-dwkjb") pod "66003ca3-e579-4dab-b714-b5b2baa26bad" (UID: "66003ca3-e579-4dab-b714-b5b2baa26bad"). InnerVolumeSpecName "kube-api-access-dwkjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:09 crc kubenswrapper[4764]: I0309 13:42:09.537764 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwkjb\" (UniqueName: \"kubernetes.io/projected/66003ca3-e579-4dab-b714-b5b2baa26bad-kube-api-access-dwkjb\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:09 crc kubenswrapper[4764]: I0309 13:42:09.538347 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66003ca3-e579-4dab-b714-b5b2baa26bad-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:09 crc kubenswrapper[4764]: I0309 13:42:09.979928 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tchwl" event={"ID":"66003ca3-e579-4dab-b714-b5b2baa26bad","Type":"ContainerDied","Data":"1b4ee63c3be90245829c57ff18357c6790f696fc4922e58488a0b15d231d7670"} Mar 09 13:42:09 crc kubenswrapper[4764]: I0309 13:42:09.979986 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b4ee63c3be90245829c57ff18357c6790f696fc4922e58488a0b15d231d7670" Mar 09 13:42:09 crc kubenswrapper[4764]: I0309 13:42:09.980195 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tchwl" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.458429 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f7d8-account-create-update-rp748" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.560069 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jk6c\" (UniqueName: \"kubernetes.io/projected/3db8af07-1310-4cd5-be07-3fd062fe89a7-kube-api-access-7jk6c\") pod \"3db8af07-1310-4cd5-be07-3fd062fe89a7\" (UID: \"3db8af07-1310-4cd5-be07-3fd062fe89a7\") " Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.560134 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3db8af07-1310-4cd5-be07-3fd062fe89a7-operator-scripts\") pod \"3db8af07-1310-4cd5-be07-3fd062fe89a7\" (UID: \"3db8af07-1310-4cd5-be07-3fd062fe89a7\") " Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.561113 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3db8af07-1310-4cd5-be07-3fd062fe89a7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3db8af07-1310-4cd5-be07-3fd062fe89a7" (UID: "3db8af07-1310-4cd5-be07-3fd062fe89a7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.567571 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3db8af07-1310-4cd5-be07-3fd062fe89a7-kube-api-access-7jk6c" (OuterVolumeSpecName: "kube-api-access-7jk6c") pod "3db8af07-1310-4cd5-be07-3fd062fe89a7" (UID: "3db8af07-1310-4cd5-be07-3fd062fe89a7"). InnerVolumeSpecName "kube-api-access-7jk6c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.669156 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jk6c\" (UniqueName: \"kubernetes.io/projected/3db8af07-1310-4cd5-be07-3fd062fe89a7-kube-api-access-7jk6c\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.669498 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3db8af07-1310-4cd5-be07-3fd062fe89a7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.678533 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-78eb-account-create-update-2dqgt" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.687756 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mnqg7" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.696160 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tbf9j" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.712176 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cc98-account-create-update-sjfqm" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.770553 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5daba6a-a01a-4400-aa87-01f9efd3abd8-operator-scripts\") pod \"b5daba6a-a01a-4400-aa87-01f9efd3abd8\" (UID: \"b5daba6a-a01a-4400-aa87-01f9efd3abd8\") " Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.770674 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c-operator-scripts\") pod \"5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c\" (UID: \"5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c\") " Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.770717 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r55f9\" (UniqueName: \"kubernetes.io/projected/b5daba6a-a01a-4400-aa87-01f9efd3abd8-kube-api-access-r55f9\") pod \"b5daba6a-a01a-4400-aa87-01f9efd3abd8\" (UID: \"b5daba6a-a01a-4400-aa87-01f9efd3abd8\") " Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.770869 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fa35355-06e1-403f-9691-92398769ac09-operator-scripts\") pod \"8fa35355-06e1-403f-9691-92398769ac09\" (UID: \"8fa35355-06e1-403f-9691-92398769ac09\") " Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.770923 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2ffg\" (UniqueName: \"kubernetes.io/projected/a75ea85a-1e66-4e8d-92d7-6f9b766abfda-kube-api-access-t2ffg\") pod \"a75ea85a-1e66-4e8d-92d7-6f9b766abfda\" (UID: \"a75ea85a-1e66-4e8d-92d7-6f9b766abfda\") " Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.770985 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-jwv2d\" (UniqueName: \"kubernetes.io/projected/5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c-kube-api-access-jwv2d\") pod \"5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c\" (UID: \"5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c\") " Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.771040 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a75ea85a-1e66-4e8d-92d7-6f9b766abfda-operator-scripts\") pod \"a75ea85a-1e66-4e8d-92d7-6f9b766abfda\" (UID: \"a75ea85a-1e66-4e8d-92d7-6f9b766abfda\") " Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.771128 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d97l\" (UniqueName: \"kubernetes.io/projected/8fa35355-06e1-403f-9691-92398769ac09-kube-api-access-8d97l\") pod \"8fa35355-06e1-403f-9691-92398769ac09\" (UID: \"8fa35355-06e1-403f-9691-92398769ac09\") " Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.771219 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5daba6a-a01a-4400-aa87-01f9efd3abd8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b5daba6a-a01a-4400-aa87-01f9efd3abd8" (UID: "b5daba6a-a01a-4400-aa87-01f9efd3abd8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.771231 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c" (UID: "5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.771292 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fa35355-06e1-403f-9691-92398769ac09-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8fa35355-06e1-403f-9691-92398769ac09" (UID: "8fa35355-06e1-403f-9691-92398769ac09"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.771720 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a75ea85a-1e66-4e8d-92d7-6f9b766abfda-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a75ea85a-1e66-4e8d-92d7-6f9b766abfda" (UID: "a75ea85a-1e66-4e8d-92d7-6f9b766abfda"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.771758 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5daba6a-a01a-4400-aa87-01f9efd3abd8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.771776 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.771790 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fa35355-06e1-403f-9691-92398769ac09-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.776853 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c-kube-api-access-jwv2d" 
(OuterVolumeSpecName: "kube-api-access-jwv2d") pod "5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c" (UID: "5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c"). InnerVolumeSpecName "kube-api-access-jwv2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.783094 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fa35355-06e1-403f-9691-92398769ac09-kube-api-access-8d97l" (OuterVolumeSpecName: "kube-api-access-8d97l") pod "8fa35355-06e1-403f-9691-92398769ac09" (UID: "8fa35355-06e1-403f-9691-92398769ac09"). InnerVolumeSpecName "kube-api-access-8d97l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.783449 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a75ea85a-1e66-4e8d-92d7-6f9b766abfda-kube-api-access-t2ffg" (OuterVolumeSpecName: "kube-api-access-t2ffg") pod "a75ea85a-1e66-4e8d-92d7-6f9b766abfda" (UID: "a75ea85a-1e66-4e8d-92d7-6f9b766abfda"). InnerVolumeSpecName "kube-api-access-t2ffg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.786945 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5daba6a-a01a-4400-aa87-01f9efd3abd8-kube-api-access-r55f9" (OuterVolumeSpecName: "kube-api-access-r55f9") pod "b5daba6a-a01a-4400-aa87-01f9efd3abd8" (UID: "b5daba6a-a01a-4400-aa87-01f9efd3abd8"). InnerVolumeSpecName "kube-api-access-r55f9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.873836 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r55f9\" (UniqueName: \"kubernetes.io/projected/b5daba6a-a01a-4400-aa87-01f9efd3abd8-kube-api-access-r55f9\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.873868 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2ffg\" (UniqueName: \"kubernetes.io/projected/a75ea85a-1e66-4e8d-92d7-6f9b766abfda-kube-api-access-t2ffg\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.873877 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwv2d\" (UniqueName: \"kubernetes.io/projected/5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c-kube-api-access-jwv2d\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.873888 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a75ea85a-1e66-4e8d-92d7-6f9b766abfda-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.873899 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8d97l\" (UniqueName: \"kubernetes.io/projected/8fa35355-06e1-403f-9691-92398769ac09-kube-api-access-8d97l\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.017958 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-78eb-account-create-update-2dqgt" event={"ID":"b5daba6a-a01a-4400-aa87-01f9efd3abd8","Type":"ContainerDied","Data":"8b9629cb75484588167c111d3c7dbfcae1e4f0827b9d156b8c69932b90498b31"} Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.018006 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b9629cb75484588167c111d3c7dbfcae1e4f0827b9d156b8c69932b90498b31" 
Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.018085 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-78eb-account-create-update-2dqgt" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.024464 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cc98-account-create-update-sjfqm" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.024546 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cc98-account-create-update-sjfqm" event={"ID":"5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c","Type":"ContainerDied","Data":"d57d5ab16c67bc105085ecfbab0c639605f83f267c2a597f6e63924c995e5ba1"} Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.024605 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d57d5ab16c67bc105085ecfbab0c639605f83f267c2a597f6e63924c995e5ba1" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.026304 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f7d8-account-create-update-rp748" event={"ID":"3db8af07-1310-4cd5-be07-3fd062fe89a7","Type":"ContainerDied","Data":"460f2f7700725f04372e42b1bbec425c0f8c5db8d2dfaf33891c7041869dfce9"} Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.026356 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="460f2f7700725f04372e42b1bbec425c0f8c5db8d2dfaf33891c7041869dfce9" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.026430 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f7d8-account-create-update-rp748" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.034103 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tbf9j" event={"ID":"8fa35355-06e1-403f-9691-92398769ac09","Type":"ContainerDied","Data":"7802efe10177c9d393bd5a30dc7dd873fbe98ff3606d74449c6a46a5501b117c"} Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.034157 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7802efe10177c9d393bd5a30dc7dd873fbe98ff3606d74449c6a46a5501b117c" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.034274 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tbf9j" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.038114 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mnqg7" event={"ID":"a75ea85a-1e66-4e8d-92d7-6f9b766abfda","Type":"ContainerDied","Data":"ab77b0a3ad0dcdcd518820673440ab1d67aaea30117cd3647b19e8df2313926c"} Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.038163 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab77b0a3ad0dcdcd518820673440ab1d67aaea30117cd3647b19e8df2313926c" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.038259 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mnqg7" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.712470 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6b74bc6bc6-vsxl5" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.790923 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-config\") pod \"50610296-d076-4c9f-ac34-a976202ce135\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.791001 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-ovndb-tls-certs\") pod \"50610296-d076-4c9f-ac34-a976202ce135\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.791185 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-httpd-config\") pod \"50610296-d076-4c9f-ac34-a976202ce135\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.791241 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnqrq\" (UniqueName: \"kubernetes.io/projected/50610296-d076-4c9f-ac34-a976202ce135-kube-api-access-xnqrq\") pod \"50610296-d076-4c9f-ac34-a976202ce135\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.791352 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-combined-ca-bundle\") pod \"50610296-d076-4c9f-ac34-a976202ce135\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.812013 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/50610296-d076-4c9f-ac34-a976202ce135-kube-api-access-xnqrq" (OuterVolumeSpecName: "kube-api-access-xnqrq") pod "50610296-d076-4c9f-ac34-a976202ce135" (UID: "50610296-d076-4c9f-ac34-a976202ce135"). InnerVolumeSpecName "kube-api-access-xnqrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.812017 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "50610296-d076-4c9f-ac34-a976202ce135" (UID: "50610296-d076-4c9f-ac34-a976202ce135"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.864632 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50610296-d076-4c9f-ac34-a976202ce135" (UID: "50610296-d076-4c9f-ac34-a976202ce135"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.873229 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-config" (OuterVolumeSpecName: "config") pod "50610296-d076-4c9f-ac34-a976202ce135" (UID: "50610296-d076-4c9f-ac34-a976202ce135"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.879844 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "50610296-d076-4c9f-ac34-a976202ce135" (UID: "50610296-d076-4c9f-ac34-a976202ce135"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.896221 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnqrq\" (UniqueName: \"kubernetes.io/projected/50610296-d076-4c9f-ac34-a976202ce135-kube-api-access-xnqrq\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.896266 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.896278 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.896288 4764 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.896301 4764 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:12 crc kubenswrapper[4764]: I0309 13:42:12.048630 4764 generic.go:334] "Generic (PLEG): container finished" podID="50610296-d076-4c9f-ac34-a976202ce135" containerID="508370eb92a6887d3b1ecf26300791f931d8f328026a6caf81b9a1ce10c18bed" exitCode=0 Mar 09 13:42:12 crc kubenswrapper[4764]: I0309 13:42:12.048697 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b74bc6bc6-vsxl5" event={"ID":"50610296-d076-4c9f-ac34-a976202ce135","Type":"ContainerDied","Data":"508370eb92a6887d3b1ecf26300791f931d8f328026a6caf81b9a1ce10c18bed"} Mar 09 13:42:12 crc kubenswrapper[4764]: I0309 
13:42:12.048724 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6b74bc6bc6-vsxl5"
Mar 09 13:42:12 crc kubenswrapper[4764]: I0309 13:42:12.048732 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b74bc6bc6-vsxl5" event={"ID":"50610296-d076-4c9f-ac34-a976202ce135","Type":"ContainerDied","Data":"bd4c896bc38b604cb19726769c37db30c4145f3642057a166913e3d7cfd24c8f"}
Mar 09 13:42:12 crc kubenswrapper[4764]: I0309 13:42:12.048752 4764 scope.go:117] "RemoveContainer" containerID="40fc71094cc1ee70bba232001951743f253a32a95993837575a49f9b6e385df9"
Mar 09 13:42:12 crc kubenswrapper[4764]: I0309 13:42:12.078168 4764 scope.go:117] "RemoveContainer" containerID="508370eb92a6887d3b1ecf26300791f931d8f328026a6caf81b9a1ce10c18bed"
Mar 09 13:42:12 crc kubenswrapper[4764]: I0309 13:42:12.093092 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6b74bc6bc6-vsxl5"]
Mar 09 13:42:12 crc kubenswrapper[4764]: I0309 13:42:12.104285 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6b74bc6bc6-vsxl5"]
Mar 09 13:42:12 crc kubenswrapper[4764]: I0309 13:42:12.112377 4764 scope.go:117] "RemoveContainer" containerID="40fc71094cc1ee70bba232001951743f253a32a95993837575a49f9b6e385df9"
Mar 09 13:42:12 crc kubenswrapper[4764]: E0309 13:42:12.113048 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40fc71094cc1ee70bba232001951743f253a32a95993837575a49f9b6e385df9\": container with ID starting with 40fc71094cc1ee70bba232001951743f253a32a95993837575a49f9b6e385df9 not found: ID does not exist" containerID="40fc71094cc1ee70bba232001951743f253a32a95993837575a49f9b6e385df9"
Mar 09 13:42:12 crc kubenswrapper[4764]: I0309 13:42:12.113099 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40fc71094cc1ee70bba232001951743f253a32a95993837575a49f9b6e385df9"} err="failed to get container status \"40fc71094cc1ee70bba232001951743f253a32a95993837575a49f9b6e385df9\": rpc error: code = NotFound desc = could not find container \"40fc71094cc1ee70bba232001951743f253a32a95993837575a49f9b6e385df9\": container with ID starting with 40fc71094cc1ee70bba232001951743f253a32a95993837575a49f9b6e385df9 not found: ID does not exist"
Mar 09 13:42:12 crc kubenswrapper[4764]: I0309 13:42:12.113126 4764 scope.go:117] "RemoveContainer" containerID="508370eb92a6887d3b1ecf26300791f931d8f328026a6caf81b9a1ce10c18bed"
Mar 09 13:42:12 crc kubenswrapper[4764]: E0309 13:42:12.113542 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"508370eb92a6887d3b1ecf26300791f931d8f328026a6caf81b9a1ce10c18bed\": container with ID starting with 508370eb92a6887d3b1ecf26300791f931d8f328026a6caf81b9a1ce10c18bed not found: ID does not exist" containerID="508370eb92a6887d3b1ecf26300791f931d8f328026a6caf81b9a1ce10c18bed"
Mar 09 13:42:12 crc kubenswrapper[4764]: I0309 13:42:12.113586 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"508370eb92a6887d3b1ecf26300791f931d8f328026a6caf81b9a1ce10c18bed"} err="failed to get container status \"508370eb92a6887d3b1ecf26300791f931d8f328026a6caf81b9a1ce10c18bed\": rpc error: code = NotFound desc = could not find container \"508370eb92a6887d3b1ecf26300791f931d8f328026a6caf81b9a1ce10c18bed\": container with ID starting with 508370eb92a6887d3b1ecf26300791f931d8f328026a6caf81b9a1ce10c18bed not found: ID does not exist"
Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.376920 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.426563 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cafd43e-a12e-46ee-8108-8e33d10c47ee-log-httpd\") pod \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") "
Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.427099 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-combined-ca-bundle\") pod \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") "
Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.427178 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-sg-core-conf-yaml\") pod \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") "
Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.427213 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cafd43e-a12e-46ee-8108-8e33d10c47ee-run-httpd\") pod \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") "
Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.427276 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-config-data\") pod \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") "
Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.427300 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-scripts\") pod \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") "
Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.427387 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x677c\" (UniqueName: \"kubernetes.io/projected/4cafd43e-a12e-46ee-8108-8e33d10c47ee-kube-api-access-x677c\") pod \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") "
Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.432502 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cafd43e-a12e-46ee-8108-8e33d10c47ee-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4cafd43e-a12e-46ee-8108-8e33d10c47ee" (UID: "4cafd43e-a12e-46ee-8108-8e33d10c47ee"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.435593 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cafd43e-a12e-46ee-8108-8e33d10c47ee-kube-api-access-x677c" (OuterVolumeSpecName: "kube-api-access-x677c") pod "4cafd43e-a12e-46ee-8108-8e33d10c47ee" (UID: "4cafd43e-a12e-46ee-8108-8e33d10c47ee"). InnerVolumeSpecName "kube-api-access-x677c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.435929 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cafd43e-a12e-46ee-8108-8e33d10c47ee-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4cafd43e-a12e-46ee-8108-8e33d10c47ee" (UID: "4cafd43e-a12e-46ee-8108-8e33d10c47ee"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.442869 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-scripts" (OuterVolumeSpecName: "scripts") pod "4cafd43e-a12e-46ee-8108-8e33d10c47ee" (UID: "4cafd43e-a12e-46ee-8108-8e33d10c47ee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.461707 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4cafd43e-a12e-46ee-8108-8e33d10c47ee" (UID: "4cafd43e-a12e-46ee-8108-8e33d10c47ee"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.529093 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4cafd43e-a12e-46ee-8108-8e33d10c47ee" (UID: "4cafd43e-a12e-46ee-8108-8e33d10c47ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.529634 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x677c\" (UniqueName: \"kubernetes.io/projected/4cafd43e-a12e-46ee-8108-8e33d10c47ee-kube-api-access-x677c\") on node \"crc\" DevicePath \"\""
Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.529688 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cafd43e-a12e-46ee-8108-8e33d10c47ee-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.529701 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.529713 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.529723 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cafd43e-a12e-46ee-8108-8e33d10c47ee-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.529733 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.536955 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-config-data" (OuterVolumeSpecName: "config-data") pod "4cafd43e-a12e-46ee-8108-8e33d10c47ee" (UID: "4cafd43e-a12e-46ee-8108-8e33d10c47ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.574229 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50610296-d076-4c9f-ac34-a976202ce135" path="/var/lib/kubelet/pods/50610296-d076-4c9f-ac34-a976202ce135/volumes"
Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.631227 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.992148 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.078854 4764 generic.go:334] "Generic (PLEG): container finished" podID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerID="8584bf911d27e4b234249d90cbfea6653767b26c85f2f4fa5eeca9b1ef3d93a2" exitCode=0
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.078924 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cafd43e-a12e-46ee-8108-8e33d10c47ee","Type":"ContainerDied","Data":"8584bf911d27e4b234249d90cbfea6653767b26c85f2f4fa5eeca9b1ef3d93a2"}
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.078968 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cafd43e-a12e-46ee-8108-8e33d10c47ee","Type":"ContainerDied","Data":"5953a3b710032064c75479dd87c9f1404c8431fcc42a83d5bac920c5b52ac282"}
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.079020 4764 scope.go:117] "RemoveContainer" containerID="5e72510b160d74d50bc0d91ad305ff42b401476dfb0ab2426d6d04bc04d5d679"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.079128 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.128741 4764 scope.go:117] "RemoveContainer" containerID="ef59562971287c90872fa0c6501fb93b73e7422eb42b03a1776c5d4727a706c9"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.130313 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.150838 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.159822 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 09 13:42:14 crc kubenswrapper[4764]: E0309 13:42:14.160308 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5daba6a-a01a-4400-aa87-01f9efd3abd8" containerName="mariadb-account-create-update"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160330 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5daba6a-a01a-4400-aa87-01f9efd3abd8" containerName="mariadb-account-create-update"
Mar 09 13:42:14 crc kubenswrapper[4764]: E0309 13:42:14.160344 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a75ea85a-1e66-4e8d-92d7-6f9b766abfda" containerName="mariadb-database-create"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160351 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a75ea85a-1e66-4e8d-92d7-6f9b766abfda" containerName="mariadb-database-create"
Mar 09 13:42:14 crc kubenswrapper[4764]: E0309 13:42:14.160363 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerName="sg-core"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160370 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerName="sg-core"
Mar 09 13:42:14 crc kubenswrapper[4764]: E0309 13:42:14.160386 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c" containerName="mariadb-account-create-update"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160392 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c" containerName="mariadb-account-create-update"
Mar 09 13:42:14 crc kubenswrapper[4764]: E0309 13:42:14.160403 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50610296-d076-4c9f-ac34-a976202ce135" containerName="neutron-httpd"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160410 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="50610296-d076-4c9f-ac34-a976202ce135" containerName="neutron-httpd"
Mar 09 13:42:14 crc kubenswrapper[4764]: E0309 13:42:14.160420 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66003ca3-e579-4dab-b714-b5b2baa26bad" containerName="mariadb-database-create"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160427 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="66003ca3-e579-4dab-b714-b5b2baa26bad" containerName="mariadb-database-create"
Mar 09 13:42:14 crc kubenswrapper[4764]: E0309 13:42:14.160442 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerName="proxy-httpd"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160449 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerName="proxy-httpd"
Mar 09 13:42:14 crc kubenswrapper[4764]: E0309 13:42:14.160463 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerName="ceilometer-central-agent"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160468 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerName="ceilometer-central-agent"
Mar 09 13:42:14 crc kubenswrapper[4764]: E0309 13:42:14.160479 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerName="ceilometer-notification-agent"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160485 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerName="ceilometer-notification-agent"
Mar 09 13:42:14 crc kubenswrapper[4764]: E0309 13:42:14.160495 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50610296-d076-4c9f-ac34-a976202ce135" containerName="neutron-api"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160502 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="50610296-d076-4c9f-ac34-a976202ce135" containerName="neutron-api"
Mar 09 13:42:14 crc kubenswrapper[4764]: E0309 13:42:14.160508 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3db8af07-1310-4cd5-be07-3fd062fe89a7" containerName="mariadb-account-create-update"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160516 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db8af07-1310-4cd5-be07-3fd062fe89a7" containerName="mariadb-account-create-update"
Mar 09 13:42:14 crc kubenswrapper[4764]: E0309 13:42:14.160525 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fa35355-06e1-403f-9691-92398769ac09" containerName="mariadb-database-create"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160531 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fa35355-06e1-403f-9691-92398769ac09" containerName="mariadb-database-create"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160767 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerName="ceilometer-central-agent"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160781 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="50610296-d076-4c9f-ac34-a976202ce135" containerName="neutron-httpd"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160795 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="66003ca3-e579-4dab-b714-b5b2baa26bad" containerName="mariadb-database-create"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160810 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerName="sg-core"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160824 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a75ea85a-1e66-4e8d-92d7-6f9b766abfda" containerName="mariadb-database-create"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160832 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerName="proxy-httpd"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160841 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="3db8af07-1310-4cd5-be07-3fd062fe89a7" containerName="mariadb-account-create-update"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160852 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5daba6a-a01a-4400-aa87-01f9efd3abd8" containerName="mariadb-account-create-update"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160864 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerName="ceilometer-notification-agent"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160874 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="50610296-d076-4c9f-ac34-a976202ce135" containerName="neutron-api"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160886 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fa35355-06e1-403f-9691-92398769ac09" containerName="mariadb-database-create"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160898 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c" containerName="mariadb-account-create-update"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.162462 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.167105 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.167442 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.167467 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.169775 4764 scope.go:117] "RemoveContainer" containerID="77c79186eac3b625857450d160ad883bb9d71ca93dc8e2e59f473a386200af76"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.173112 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.199322 4764 scope.go:117] "RemoveContainer" containerID="8584bf911d27e4b234249d90cbfea6653767b26c85f2f4fa5eeca9b1ef3d93a2"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.216947 4764 scope.go:117] "RemoveContainer" containerID="5e72510b160d74d50bc0d91ad305ff42b401476dfb0ab2426d6d04bc04d5d679"
Mar 09 13:42:14 crc kubenswrapper[4764]: E0309 13:42:14.217403 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e72510b160d74d50bc0d91ad305ff42b401476dfb0ab2426d6d04bc04d5d679\": container with ID starting with 5e72510b160d74d50bc0d91ad305ff42b401476dfb0ab2426d6d04bc04d5d679 not found: ID does not exist" containerID="5e72510b160d74d50bc0d91ad305ff42b401476dfb0ab2426d6d04bc04d5d679"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.217436 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e72510b160d74d50bc0d91ad305ff42b401476dfb0ab2426d6d04bc04d5d679"} err="failed to get container status \"5e72510b160d74d50bc0d91ad305ff42b401476dfb0ab2426d6d04bc04d5d679\": rpc error: code = NotFound desc = could not find container \"5e72510b160d74d50bc0d91ad305ff42b401476dfb0ab2426d6d04bc04d5d679\": container with ID starting with 5e72510b160d74d50bc0d91ad305ff42b401476dfb0ab2426d6d04bc04d5d679 not found: ID does not exist"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.217458 4764 scope.go:117] "RemoveContainer" containerID="ef59562971287c90872fa0c6501fb93b73e7422eb42b03a1776c5d4727a706c9"
Mar 09 13:42:14 crc kubenswrapper[4764]: E0309 13:42:14.217677 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef59562971287c90872fa0c6501fb93b73e7422eb42b03a1776c5d4727a706c9\": container with ID starting with ef59562971287c90872fa0c6501fb93b73e7422eb42b03a1776c5d4727a706c9 not found: ID does not exist" containerID="ef59562971287c90872fa0c6501fb93b73e7422eb42b03a1776c5d4727a706c9"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.217697 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef59562971287c90872fa0c6501fb93b73e7422eb42b03a1776c5d4727a706c9"} err="failed to get container status \"ef59562971287c90872fa0c6501fb93b73e7422eb42b03a1776c5d4727a706c9\": rpc error: code = NotFound desc = could not find container \"ef59562971287c90872fa0c6501fb93b73e7422eb42b03a1776c5d4727a706c9\": container with ID starting with ef59562971287c90872fa0c6501fb93b73e7422eb42b03a1776c5d4727a706c9 not found: ID does not exist"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.217709 4764 scope.go:117] "RemoveContainer" containerID="77c79186eac3b625857450d160ad883bb9d71ca93dc8e2e59f473a386200af76"
Mar 09 13:42:14 crc kubenswrapper[4764]: E0309 13:42:14.217939 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77c79186eac3b625857450d160ad883bb9d71ca93dc8e2e59f473a386200af76\": container with ID starting with 77c79186eac3b625857450d160ad883bb9d71ca93dc8e2e59f473a386200af76 not found: ID does not exist" containerID="77c79186eac3b625857450d160ad883bb9d71ca93dc8e2e59f473a386200af76"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.217961 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77c79186eac3b625857450d160ad883bb9d71ca93dc8e2e59f473a386200af76"} err="failed to get container status \"77c79186eac3b625857450d160ad883bb9d71ca93dc8e2e59f473a386200af76\": rpc error: code = NotFound desc = could not find container \"77c79186eac3b625857450d160ad883bb9d71ca93dc8e2e59f473a386200af76\": container with ID starting with 77c79186eac3b625857450d160ad883bb9d71ca93dc8e2e59f473a386200af76 not found: ID does not exist"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.217973 4764 scope.go:117] "RemoveContainer" containerID="8584bf911d27e4b234249d90cbfea6653767b26c85f2f4fa5eeca9b1ef3d93a2"
Mar 09 13:42:14 crc kubenswrapper[4764]: E0309 13:42:14.218187 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8584bf911d27e4b234249d90cbfea6653767b26c85f2f4fa5eeca9b1ef3d93a2\": container with ID starting with 8584bf911d27e4b234249d90cbfea6653767b26c85f2f4fa5eeca9b1ef3d93a2 not found: ID does not exist" containerID="8584bf911d27e4b234249d90cbfea6653767b26c85f2f4fa5eeca9b1ef3d93a2"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.218206 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8584bf911d27e4b234249d90cbfea6653767b26c85f2f4fa5eeca9b1ef3d93a2"} err="failed to get container status \"8584bf911d27e4b234249d90cbfea6653767b26c85f2f4fa5eeca9b1ef3d93a2\": rpc error: code = NotFound desc = could not find container \"8584bf911d27e4b234249d90cbfea6653767b26c85f2f4fa5eeca9b1ef3d93a2\": container with ID starting with 8584bf911d27e4b234249d90cbfea6653767b26c85f2f4fa5eeca9b1ef3d93a2 not found: ID does not exist"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.247704 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-scripts\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.247900 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.248078 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.248437 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-log-httpd\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.248556 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-config-data\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.248637 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-run-httpd\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.248869 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m8ll\" (UniqueName: \"kubernetes.io/projected/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-kube-api-access-4m8ll\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.248981 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.350607 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-log-httpd\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.350688 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-config-data\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.350713 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-run-httpd\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.350766 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m8ll\" (UniqueName: \"kubernetes.io/projected/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-kube-api-access-4m8ll\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.350803 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.350837 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-scripts\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.350865 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.350897 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.351745 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-run-httpd\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.352282 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-log-httpd\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.355930 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.356502 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.357543 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-scripts\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.365502 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-config-data\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.371008 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m8ll\" (UniqueName: \"kubernetes.io/projected/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-kube-api-access-4m8ll\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.377879 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.481417 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.935920 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 13:42:15 crc kubenswrapper[4764]: I0309 13:42:15.089864 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e78a4ead-5459-49a9-89f6-5e21ac1baa3c","Type":"ContainerStarted","Data":"79e7c65f033121e5c29021bfaba325c95dd7684d4ddbd0796fab12986b9aed27"}
Mar 09 13:42:15 crc kubenswrapper[4764]: I0309 13:42:15.584076 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" path="/var/lib/kubelet/pods/4cafd43e-a12e-46ee-8108-8e33d10c47ee/volumes"
Mar 09 13:42:16 crc kubenswrapper[4764]: I0309 13:42:16.119420 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e78a4ead-5459-49a9-89f6-5e21ac1baa3c","Type":"ContainerStarted","Data":"bb420468c16a3231754e16ddfa5b5dad7573dcd0fff5049f6f34e1d807166b01"}
Mar 09 13:42:16 crc kubenswrapper[4764]: I0309 13:42:16.883938 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kkcml"]
Mar 09 13:42:16 crc kubenswrapper[4764]: I0309 13:42:16.885864 4764 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kkcml" Mar 09 13:42:16 crc kubenswrapper[4764]: I0309 13:42:16.889124 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 09 13:42:16 crc kubenswrapper[4764]: I0309 13:42:16.889370 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 09 13:42:16 crc kubenswrapper[4764]: I0309 13:42:16.897546 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-v8mzr" Mar 09 13:42:16 crc kubenswrapper[4764]: I0309 13:42:16.908871 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kkcml"] Mar 09 13:42:17 crc kubenswrapper[4764]: I0309 13:42:17.013064 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-scripts\") pod \"nova-cell0-conductor-db-sync-kkcml\" (UID: \"c0a40476-ff1d-443d-846f-a54cd956aaa3\") " pod="openstack/nova-cell0-conductor-db-sync-kkcml" Mar 09 13:42:17 crc kubenswrapper[4764]: I0309 13:42:17.013420 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-config-data\") pod \"nova-cell0-conductor-db-sync-kkcml\" (UID: \"c0a40476-ff1d-443d-846f-a54cd956aaa3\") " pod="openstack/nova-cell0-conductor-db-sync-kkcml" Mar 09 13:42:17 crc kubenswrapper[4764]: I0309 13:42:17.013452 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grsz2\" (UniqueName: \"kubernetes.io/projected/c0a40476-ff1d-443d-846f-a54cd956aaa3-kube-api-access-grsz2\") pod \"nova-cell0-conductor-db-sync-kkcml\" (UID: \"c0a40476-ff1d-443d-846f-a54cd956aaa3\") " 
pod="openstack/nova-cell0-conductor-db-sync-kkcml" Mar 09 13:42:17 crc kubenswrapper[4764]: I0309 13:42:17.013533 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-kkcml\" (UID: \"c0a40476-ff1d-443d-846f-a54cd956aaa3\") " pod="openstack/nova-cell0-conductor-db-sync-kkcml" Mar 09 13:42:17 crc kubenswrapper[4764]: I0309 13:42:17.115750 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grsz2\" (UniqueName: \"kubernetes.io/projected/c0a40476-ff1d-443d-846f-a54cd956aaa3-kube-api-access-grsz2\") pod \"nova-cell0-conductor-db-sync-kkcml\" (UID: \"c0a40476-ff1d-443d-846f-a54cd956aaa3\") " pod="openstack/nova-cell0-conductor-db-sync-kkcml" Mar 09 13:42:17 crc kubenswrapper[4764]: I0309 13:42:17.115911 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-kkcml\" (UID: \"c0a40476-ff1d-443d-846f-a54cd956aaa3\") " pod="openstack/nova-cell0-conductor-db-sync-kkcml" Mar 09 13:42:17 crc kubenswrapper[4764]: I0309 13:42:17.115985 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-scripts\") pod \"nova-cell0-conductor-db-sync-kkcml\" (UID: \"c0a40476-ff1d-443d-846f-a54cd956aaa3\") " pod="openstack/nova-cell0-conductor-db-sync-kkcml" Mar 09 13:42:17 crc kubenswrapper[4764]: I0309 13:42:17.116037 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-config-data\") pod \"nova-cell0-conductor-db-sync-kkcml\" (UID: 
\"c0a40476-ff1d-443d-846f-a54cd956aaa3\") " pod="openstack/nova-cell0-conductor-db-sync-kkcml" Mar 09 13:42:17 crc kubenswrapper[4764]: I0309 13:42:17.123397 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-kkcml\" (UID: \"c0a40476-ff1d-443d-846f-a54cd956aaa3\") " pod="openstack/nova-cell0-conductor-db-sync-kkcml" Mar 09 13:42:17 crc kubenswrapper[4764]: I0309 13:42:17.125525 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-scripts\") pod \"nova-cell0-conductor-db-sync-kkcml\" (UID: \"c0a40476-ff1d-443d-846f-a54cd956aaa3\") " pod="openstack/nova-cell0-conductor-db-sync-kkcml" Mar 09 13:42:17 crc kubenswrapper[4764]: I0309 13:42:17.125808 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-config-data\") pod \"nova-cell0-conductor-db-sync-kkcml\" (UID: \"c0a40476-ff1d-443d-846f-a54cd956aaa3\") " pod="openstack/nova-cell0-conductor-db-sync-kkcml" Mar 09 13:42:17 crc kubenswrapper[4764]: I0309 13:42:17.133496 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e78a4ead-5459-49a9-89f6-5e21ac1baa3c","Type":"ContainerStarted","Data":"b214631e329516c4f150fb9c3cadae477dc2a4812e6916faa36fc494a5d835b8"} Mar 09 13:42:17 crc kubenswrapper[4764]: I0309 13:42:17.133534 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e78a4ead-5459-49a9-89f6-5e21ac1baa3c","Type":"ContainerStarted","Data":"1c0d0d684bf2cb708c8f8559ea9b7bec1b048fb2e4835892b259f454ffd824ea"} Mar 09 13:42:17 crc kubenswrapper[4764]: I0309 13:42:17.140439 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-grsz2\" (UniqueName: \"kubernetes.io/projected/c0a40476-ff1d-443d-846f-a54cd956aaa3-kube-api-access-grsz2\") pod \"nova-cell0-conductor-db-sync-kkcml\" (UID: \"c0a40476-ff1d-443d-846f-a54cd956aaa3\") " pod="openstack/nova-cell0-conductor-db-sync-kkcml" Mar 09 13:42:17 crc kubenswrapper[4764]: I0309 13:42:17.206809 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kkcml" Mar 09 13:42:17 crc kubenswrapper[4764]: I0309 13:42:17.688636 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kkcml"] Mar 09 13:42:18 crc kubenswrapper[4764]: I0309 13:42:18.161620 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kkcml" event={"ID":"c0a40476-ff1d-443d-846f-a54cd956aaa3","Type":"ContainerStarted","Data":"185f2547ce3fbe2885edcb4f82761dd8558e20b523bd1229f310adc3829d04d5"} Mar 09 13:42:19 crc kubenswrapper[4764]: I0309 13:42:19.175620 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e78a4ead-5459-49a9-89f6-5e21ac1baa3c","Type":"ContainerStarted","Data":"bed8beb8204aa72aa2bde54753848668402830e464bf5f8c233fd5f8d9ccc674"} Mar 09 13:42:19 crc kubenswrapper[4764]: I0309 13:42:19.176157 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 13:42:19 crc kubenswrapper[4764]: I0309 13:42:19.208709 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.250911549 podStartE2EDuration="5.20868431s" podCreationTimestamp="2026-03-09 13:42:14 +0000 UTC" firstStartedPulling="2026-03-09 13:42:14.949379241 +0000 UTC m=+1290.199551159" lastFinishedPulling="2026-03-09 13:42:18.907152012 +0000 UTC m=+1294.157323920" observedRunningTime="2026-03-09 13:42:19.199311555 +0000 UTC m=+1294.449483473" watchObservedRunningTime="2026-03-09 13:42:19.20868431 +0000 
UTC m=+1294.458856218" Mar 09 13:42:26 crc kubenswrapper[4764]: I0309 13:42:26.265164 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kkcml" event={"ID":"c0a40476-ff1d-443d-846f-a54cd956aaa3","Type":"ContainerStarted","Data":"01f8a44cba8ccff51deb3a73f62a17d6b208f1f8fb17e65f80892f3b0a90b431"} Mar 09 13:42:26 crc kubenswrapper[4764]: I0309 13:42:26.288017 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-kkcml" podStartSLOduration=2.04921348 podStartE2EDuration="10.287999013s" podCreationTimestamp="2026-03-09 13:42:16 +0000 UTC" firstStartedPulling="2026-03-09 13:42:17.711550105 +0000 UTC m=+1292.961722013" lastFinishedPulling="2026-03-09 13:42:25.950335638 +0000 UTC m=+1301.200507546" observedRunningTime="2026-03-09 13:42:26.281768947 +0000 UTC m=+1301.531940855" watchObservedRunningTime="2026-03-09 13:42:26.287999013 +0000 UTC m=+1301.538170921" Mar 09 13:42:30 crc kubenswrapper[4764]: I0309 13:42:30.998886 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:42:31 crc kubenswrapper[4764]: I0309 13:42:31.000203 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="sg-core" containerID="cri-o://b214631e329516c4f150fb9c3cadae477dc2a4812e6916faa36fc494a5d835b8" gracePeriod=30 Mar 09 13:42:31 crc kubenswrapper[4764]: I0309 13:42:31.000192 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="ceilometer-central-agent" containerID="cri-o://bb420468c16a3231754e16ddfa5b5dad7573dcd0fff5049f6f34e1d807166b01" gracePeriod=30 Mar 09 13:42:31 crc kubenswrapper[4764]: I0309 13:42:31.000408 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="proxy-httpd" containerID="cri-o://bed8beb8204aa72aa2bde54753848668402830e464bf5f8c233fd5f8d9ccc674" gracePeriod=30 Mar 09 13:42:31 crc kubenswrapper[4764]: I0309 13:42:31.000441 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="ceilometer-notification-agent" containerID="cri-o://1c0d0d684bf2cb708c8f8559ea9b7bec1b048fb2e4835892b259f454ffd824ea" gracePeriod=30 Mar 09 13:42:31 crc kubenswrapper[4764]: I0309 13:42:31.015361 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.172:3000/\": EOF" Mar 09 13:42:31 crc kubenswrapper[4764]: I0309 13:42:31.317929 4764 generic.go:334] "Generic (PLEG): container finished" podID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerID="bed8beb8204aa72aa2bde54753848668402830e464bf5f8c233fd5f8d9ccc674" exitCode=0 Mar 09 13:42:31 crc kubenswrapper[4764]: I0309 13:42:31.317973 4764 generic.go:334] "Generic (PLEG): container finished" podID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerID="b214631e329516c4f150fb9c3cadae477dc2a4812e6916faa36fc494a5d835b8" exitCode=2 Mar 09 13:42:31 crc kubenswrapper[4764]: I0309 13:42:31.318004 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e78a4ead-5459-49a9-89f6-5e21ac1baa3c","Type":"ContainerDied","Data":"bed8beb8204aa72aa2bde54753848668402830e464bf5f8c233fd5f8d9ccc674"} Mar 09 13:42:31 crc kubenswrapper[4764]: I0309 13:42:31.318069 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e78a4ead-5459-49a9-89f6-5e21ac1baa3c","Type":"ContainerDied","Data":"b214631e329516c4f150fb9c3cadae477dc2a4812e6916faa36fc494a5d835b8"} Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.342005 
4764 generic.go:334] "Generic (PLEG): container finished" podID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerID="1c0d0d684bf2cb708c8f8559ea9b7bec1b048fb2e4835892b259f454ffd824ea" exitCode=0 Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.342421 4764 generic.go:334] "Generic (PLEG): container finished" podID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerID="bb420468c16a3231754e16ddfa5b5dad7573dcd0fff5049f6f34e1d807166b01" exitCode=0 Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.342210 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e78a4ead-5459-49a9-89f6-5e21ac1baa3c","Type":"ContainerDied","Data":"1c0d0d684bf2cb708c8f8559ea9b7bec1b048fb2e4835892b259f454ffd824ea"} Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.342473 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e78a4ead-5459-49a9-89f6-5e21ac1baa3c","Type":"ContainerDied","Data":"bb420468c16a3231754e16ddfa5b5dad7573dcd0fff5049f6f34e1d807166b01"} Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.700340 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.844119 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-scripts\") pod \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.844219 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-sg-core-conf-yaml\") pod \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.844259 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-combined-ca-bundle\") pod \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.844326 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-run-httpd\") pod \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.844402 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-ceilometer-tls-certs\") pod \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.844573 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-log-httpd\") pod \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.844661 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m8ll\" (UniqueName: \"kubernetes.io/projected/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-kube-api-access-4m8ll\") pod \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.844731 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-config-data\") pod \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.845570 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e78a4ead-5459-49a9-89f6-5e21ac1baa3c" (UID: "e78a4ead-5459-49a9-89f6-5e21ac1baa3c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.847432 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e78a4ead-5459-49a9-89f6-5e21ac1baa3c" (UID: "e78a4ead-5459-49a9-89f6-5e21ac1baa3c"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.948237 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.948280 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.220240 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-scripts" (OuterVolumeSpecName: "scripts") pod "e78a4ead-5459-49a9-89f6-5e21ac1baa3c" (UID: "e78a4ead-5459-49a9-89f6-5e21ac1baa3c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.220271 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-kube-api-access-4m8ll" (OuterVolumeSpecName: "kube-api-access-4m8ll") pod "e78a4ead-5459-49a9-89f6-5e21ac1baa3c" (UID: "e78a4ead-5459-49a9-89f6-5e21ac1baa3c"). InnerVolumeSpecName "kube-api-access-4m8ll". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.237569 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e78a4ead-5459-49a9-89f6-5e21ac1baa3c" (UID: "e78a4ead-5459-49a9-89f6-5e21ac1baa3c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.258508 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4m8ll\" (UniqueName: \"kubernetes.io/projected/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-kube-api-access-4m8ll\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.258551 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.258567 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.259244 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e78a4ead-5459-49a9-89f6-5e21ac1baa3c" (UID: "e78a4ead-5459-49a9-89f6-5e21ac1baa3c"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.304903 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e78a4ead-5459-49a9-89f6-5e21ac1baa3c" (UID: "e78a4ead-5459-49a9-89f6-5e21ac1baa3c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.305382 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-config-data" (OuterVolumeSpecName: "config-data") pod "e78a4ead-5459-49a9-89f6-5e21ac1baa3c" (UID: "e78a4ead-5459-49a9-89f6-5e21ac1baa3c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.357254 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e78a4ead-5459-49a9-89f6-5e21ac1baa3c","Type":"ContainerDied","Data":"79e7c65f033121e5c29021bfaba325c95dd7684d4ddbd0796fab12986b9aed27"} Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.357318 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.357318 4764 scope.go:117] "RemoveContainer" containerID="bed8beb8204aa72aa2bde54753848668402830e464bf5f8c233fd5f8d9ccc674" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.361300 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.361328 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.361342 4764 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 
13:42:33.413484 4764 scope.go:117] "RemoveContainer" containerID="b214631e329516c4f150fb9c3cadae477dc2a4812e6916faa36fc494a5d835b8" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.427085 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.449901 4764 scope.go:117] "RemoveContainer" containerID="1c0d0d684bf2cb708c8f8559ea9b7bec1b048fb2e4835892b259f454ffd824ea" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.483167 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.497720 4764 scope.go:117] "RemoveContainer" containerID="bb420468c16a3231754e16ddfa5b5dad7573dcd0fff5049f6f34e1d807166b01" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.510866 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:42:33 crc kubenswrapper[4764]: E0309 13:42:33.511471 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="ceilometer-central-agent" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.511513 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="ceilometer-central-agent" Mar 09 13:42:33 crc kubenswrapper[4764]: E0309 13:42:33.511526 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="sg-core" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.511533 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="sg-core" Mar 09 13:42:33 crc kubenswrapper[4764]: E0309 13:42:33.511551 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="proxy-httpd" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.511557 4764 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="proxy-httpd" Mar 09 13:42:33 crc kubenswrapper[4764]: E0309 13:42:33.511567 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="ceilometer-notification-agent" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.511572 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="ceilometer-notification-agent" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.511779 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="ceilometer-notification-agent" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.511807 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="sg-core" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.511821 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="proxy-httpd" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.511836 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="ceilometer-central-agent" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.513785 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.516928 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.517839 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.519554 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.525590 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.572301 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" path="/var/lib/kubelet/pods/e78a4ead-5459-49a9-89f6-5e21ac1baa3c/volumes" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.594342 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.594424 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-config-data\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.594449 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-ceilometer-tls-certs\") pod 
\"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.594471 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9xwx\" (UniqueName: \"kubernetes.io/projected/85e5a081-f0bc-457a-a3f0-9f9152f942c3-kube-api-access-k9xwx\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.594494 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.594519 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85e5a081-f0bc-457a-a3f0-9f9152f942c3-run-httpd\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.594543 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-scripts\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.594565 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85e5a081-f0bc-457a-a3f0-9f9152f942c3-log-httpd\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0" Mar 09 13:42:33 crc kubenswrapper[4764]: 
I0309 13:42:33.695505 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85e5a081-f0bc-457a-a3f0-9f9152f942c3-run-httpd\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.695571 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-scripts\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.695593 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85e5a081-f0bc-457a-a3f0-9f9152f942c3-log-httpd\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.695705 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.695758 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-config-data\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.695780 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.695811 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9xwx\" (UniqueName: \"kubernetes.io/projected/85e5a081-f0bc-457a-a3f0-9f9152f942c3-kube-api-access-k9xwx\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.695836 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.697445 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85e5a081-f0bc-457a-a3f0-9f9152f942c3-log-httpd\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.697983 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85e5a081-f0bc-457a-a3f0-9f9152f942c3-run-httpd\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.701479 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.701753 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.701846 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.701924 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-scripts\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.718824 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-config-data\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.729460 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9xwx\" (UniqueName: \"kubernetes.io/projected/85e5a081-f0bc-457a-a3f0-9f9152f942c3-kube-api-access-k9xwx\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0" Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.841211 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:42:34 crc kubenswrapper[4764]: I0309 13:42:34.324617 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:42:34 crc kubenswrapper[4764]: I0309 13:42:34.370023 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85e5a081-f0bc-457a-a3f0-9f9152f942c3","Type":"ContainerStarted","Data":"d2d07148bb1bfe75a3a625df2d14aba687a70d2b7d04873d08b4da416d307e9c"} Mar 09 13:42:35 crc kubenswrapper[4764]: I0309 13:42:35.385946 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85e5a081-f0bc-457a-a3f0-9f9152f942c3","Type":"ContainerStarted","Data":"83e456ab335c632ce0189226c9d2e7f0e9ecfe5a406e7edda24caa25bafb3539"} Mar 09 13:42:36 crc kubenswrapper[4764]: I0309 13:42:36.399415 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85e5a081-f0bc-457a-a3f0-9f9152f942c3","Type":"ContainerStarted","Data":"14dd461ed43101072708519038bb785a1ba517ec81427d8fb63817a3eca45624"} Mar 09 13:42:37 crc kubenswrapper[4764]: I0309 13:42:37.410993 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85e5a081-f0bc-457a-a3f0-9f9152f942c3","Type":"ContainerStarted","Data":"b944ec589887c7cb6534662085f3e5e58d2f1dce03f0aa12b1ed3e01a649c84e"} Mar 09 13:42:37 crc kubenswrapper[4764]: I0309 13:42:37.412941 4764 generic.go:334] "Generic (PLEG): container finished" podID="c0a40476-ff1d-443d-846f-a54cd956aaa3" containerID="01f8a44cba8ccff51deb3a73f62a17d6b208f1f8fb17e65f80892f3b0a90b431" exitCode=0 Mar 09 13:42:37 crc kubenswrapper[4764]: I0309 13:42:37.412997 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kkcml" event={"ID":"c0a40476-ff1d-443d-846f-a54cd956aaa3","Type":"ContainerDied","Data":"01f8a44cba8ccff51deb3a73f62a17d6b208f1f8fb17e65f80892f3b0a90b431"} Mar 09 13:42:38 crc 
kubenswrapper[4764]: I0309 13:42:38.831462 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kkcml" Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.000561 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-config-data\") pod \"c0a40476-ff1d-443d-846f-a54cd956aaa3\" (UID: \"c0a40476-ff1d-443d-846f-a54cd956aaa3\") " Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.000843 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-scripts\") pod \"c0a40476-ff1d-443d-846f-a54cd956aaa3\" (UID: \"c0a40476-ff1d-443d-846f-a54cd956aaa3\") " Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.000884 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grsz2\" (UniqueName: \"kubernetes.io/projected/c0a40476-ff1d-443d-846f-a54cd956aaa3-kube-api-access-grsz2\") pod \"c0a40476-ff1d-443d-846f-a54cd956aaa3\" (UID: \"c0a40476-ff1d-443d-846f-a54cd956aaa3\") " Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.000912 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-combined-ca-bundle\") pod \"c0a40476-ff1d-443d-846f-a54cd956aaa3\" (UID: \"c0a40476-ff1d-443d-846f-a54cd956aaa3\") " Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.008503 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0a40476-ff1d-443d-846f-a54cd956aaa3-kube-api-access-grsz2" (OuterVolumeSpecName: "kube-api-access-grsz2") pod "c0a40476-ff1d-443d-846f-a54cd956aaa3" (UID: "c0a40476-ff1d-443d-846f-a54cd956aaa3"). InnerVolumeSpecName "kube-api-access-grsz2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.022091 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-scripts" (OuterVolumeSpecName: "scripts") pod "c0a40476-ff1d-443d-846f-a54cd956aaa3" (UID: "c0a40476-ff1d-443d-846f-a54cd956aaa3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.034018 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-config-data" (OuterVolumeSpecName: "config-data") pod "c0a40476-ff1d-443d-846f-a54cd956aaa3" (UID: "c0a40476-ff1d-443d-846f-a54cd956aaa3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.047816 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0a40476-ff1d-443d-846f-a54cd956aaa3" (UID: "c0a40476-ff1d-443d-846f-a54cd956aaa3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.103850 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.103902 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.103912 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.103921 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grsz2\" (UniqueName: \"kubernetes.io/projected/c0a40476-ff1d-443d-846f-a54cd956aaa3-kube-api-access-grsz2\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.436171 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kkcml" event={"ID":"c0a40476-ff1d-443d-846f-a54cd956aaa3","Type":"ContainerDied","Data":"185f2547ce3fbe2885edcb4f82761dd8558e20b523bd1229f310adc3829d04d5"} Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.436713 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="185f2547ce3fbe2885edcb4f82761dd8558e20b523bd1229f310adc3829d04d5" Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.436253 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kkcml" Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.577676 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 09 13:42:39 crc kubenswrapper[4764]: E0309 13:42:39.578684 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0a40476-ff1d-443d-846f-a54cd956aaa3" containerName="nova-cell0-conductor-db-sync" Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.578713 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0a40476-ff1d-443d-846f-a54cd956aaa3" containerName="nova-cell0-conductor-db-sync" Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.578982 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0a40476-ff1d-443d-846f-a54cd956aaa3" containerName="nova-cell0-conductor-db-sync" Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.579964 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.584963 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-v8mzr" Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.585418 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.590692 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.719805 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9\") " pod="openstack/nova-cell0-conductor-0" Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 
13:42:39.719884 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9\") " pod="openstack/nova-cell0-conductor-0" Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.719978 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xg49\" (UniqueName: \"kubernetes.io/projected/64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9-kube-api-access-2xg49\") pod \"nova-cell0-conductor-0\" (UID: \"64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9\") " pod="openstack/nova-cell0-conductor-0" Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.822535 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9\") " pod="openstack/nova-cell0-conductor-0" Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.822588 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9\") " pod="openstack/nova-cell0-conductor-0" Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.822619 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xg49\" (UniqueName: \"kubernetes.io/projected/64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9-kube-api-access-2xg49\") pod \"nova-cell0-conductor-0\" (UID: \"64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9\") " pod="openstack/nova-cell0-conductor-0" Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.827377 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9\") " pod="openstack/nova-cell0-conductor-0" Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.837540 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9\") " pod="openstack/nova-cell0-conductor-0" Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.847573 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xg49\" (UniqueName: \"kubernetes.io/projected/64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9-kube-api-access-2xg49\") pod \"nova-cell0-conductor-0\" (UID: \"64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9\") " pod="openstack/nova-cell0-conductor-0" Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.899984 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 09 13:42:40 crc kubenswrapper[4764]: I0309 13:42:40.406316 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 09 13:42:40 crc kubenswrapper[4764]: W0309 13:42:40.410120 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64bc45ce_7cc3_4d3a_97d7_9e73bfcb4fe9.slice/crio-0c7f63de0584bfaa1552958048369043e4a41b160fc9a684200d3911215a7010 WatchSource:0}: Error finding container 0c7f63de0584bfaa1552958048369043e4a41b160fc9a684200d3911215a7010: Status 404 returned error can't find the container with id 0c7f63de0584bfaa1552958048369043e4a41b160fc9a684200d3911215a7010 Mar 09 13:42:40 crc kubenswrapper[4764]: I0309 13:42:40.461011 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85e5a081-f0bc-457a-a3f0-9f9152f942c3","Type":"ContainerStarted","Data":"6967373dce85bc726e8932709b98fc503ab51918ee1216795cbe6f5192aa2e00"} Mar 09 13:42:40 crc kubenswrapper[4764]: I0309 13:42:40.462659 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 13:42:40 crc kubenswrapper[4764]: I0309 13:42:40.463249 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9","Type":"ContainerStarted","Data":"0c7f63de0584bfaa1552958048369043e4a41b160fc9a684200d3911215a7010"} Mar 09 13:42:40 crc kubenswrapper[4764]: I0309 13:42:40.493267 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.180573523 podStartE2EDuration="7.493246005s" podCreationTimestamp="2026-03-09 13:42:33 +0000 UTC" firstStartedPulling="2026-03-09 13:42:34.32984102 +0000 UTC m=+1309.580012928" lastFinishedPulling="2026-03-09 13:42:39.642513502 +0000 UTC m=+1314.892685410" 
observedRunningTime="2026-03-09 13:42:40.483812583 +0000 UTC m=+1315.733984511" watchObservedRunningTime="2026-03-09 13:42:40.493246005 +0000 UTC m=+1315.743417913" Mar 09 13:42:41 crc kubenswrapper[4764]: I0309 13:42:41.484189 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9","Type":"ContainerStarted","Data":"c310a7e59e0c7b01e3eb31eec7e4b0fdfc9a11ffb7a2e9e3c10feffb9eb0b6c4"} Mar 09 13:42:41 crc kubenswrapper[4764]: I0309 13:42:41.484596 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 09 13:42:49 crc kubenswrapper[4764]: I0309 13:42:49.936008 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 09 13:42:49 crc kubenswrapper[4764]: I0309 13:42:49.961002 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=10.960971863 podStartE2EDuration="10.960971863s" podCreationTimestamp="2026-03-09 13:42:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:42:41.508632075 +0000 UTC m=+1316.758803983" watchObservedRunningTime="2026-03-09 13:42:49.960971863 +0000 UTC m=+1325.211143801" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.591005 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-5k46p"] Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.593031 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5k46p" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.597128 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.597283 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.614253 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-5k46p"] Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.695356 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb6mc\" (UniqueName: \"kubernetes.io/projected/9f09c604-028e-4965-aef8-6005ae365be9-kube-api-access-sb6mc\") pod \"nova-cell0-cell-mapping-5k46p\" (UID: \"9f09c604-028e-4965-aef8-6005ae365be9\") " pod="openstack/nova-cell0-cell-mapping-5k46p" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.695518 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-config-data\") pod \"nova-cell0-cell-mapping-5k46p\" (UID: \"9f09c604-028e-4965-aef8-6005ae365be9\") " pod="openstack/nova-cell0-cell-mapping-5k46p" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.696042 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-scripts\") pod \"nova-cell0-cell-mapping-5k46p\" (UID: \"9f09c604-028e-4965-aef8-6005ae365be9\") " pod="openstack/nova-cell0-cell-mapping-5k46p" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.696271 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-5k46p\" (UID: \"9f09c604-028e-4965-aef8-6005ae365be9\") " pod="openstack/nova-cell0-cell-mapping-5k46p" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.785285 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.786985 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.794218 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.797815 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb6mc\" (UniqueName: \"kubernetes.io/projected/9f09c604-028e-4965-aef8-6005ae365be9-kube-api-access-sb6mc\") pod \"nova-cell0-cell-mapping-5k46p\" (UID: \"9f09c604-028e-4965-aef8-6005ae365be9\") " pod="openstack/nova-cell0-cell-mapping-5k46p" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.797898 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-config-data\") pod \"nova-cell0-cell-mapping-5k46p\" (UID: \"9f09c604-028e-4965-aef8-6005ae365be9\") " pod="openstack/nova-cell0-cell-mapping-5k46p" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.797967 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-scripts\") pod \"nova-cell0-cell-mapping-5k46p\" (UID: \"9f09c604-028e-4965-aef8-6005ae365be9\") " pod="openstack/nova-cell0-cell-mapping-5k46p" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.798008 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-5k46p\" (UID: \"9f09c604-028e-4965-aef8-6005ae365be9\") " pod="openstack/nova-cell0-cell-mapping-5k46p" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.807916 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-config-data\") pod \"nova-cell0-cell-mapping-5k46p\" (UID: \"9f09c604-028e-4965-aef8-6005ae365be9\") " pod="openstack/nova-cell0-cell-mapping-5k46p" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.808522 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-scripts\") pod \"nova-cell0-cell-mapping-5k46p\" (UID: \"9f09c604-028e-4965-aef8-6005ae365be9\") " pod="openstack/nova-cell0-cell-mapping-5k46p" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.822151 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-5k46p\" (UID: \"9f09c604-028e-4965-aef8-6005ae365be9\") " pod="openstack/nova-cell0-cell-mapping-5k46p" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.845007 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.885320 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb6mc\" (UniqueName: \"kubernetes.io/projected/9f09c604-028e-4965-aef8-6005ae365be9-kube-api-access-sb6mc\") pod \"nova-cell0-cell-mapping-5k46p\" (UID: \"9f09c604-028e-4965-aef8-6005ae365be9\") " pod="openstack/nova-cell0-cell-mapping-5k46p" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.903105 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cf59ae6-37a7-49a9-846d-e7815a57bda6-logs\") pod \"nova-api-0\" (UID: \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\") " pod="openstack/nova-api-0" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.903153 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkp7s\" (UniqueName: \"kubernetes.io/projected/7cf59ae6-37a7-49a9-846d-e7815a57bda6-kube-api-access-nkp7s\") pod \"nova-api-0\" (UID: \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\") " pod="openstack/nova-api-0" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.903196 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cf59ae6-37a7-49a9-846d-e7815a57bda6-config-data\") pod \"nova-api-0\" (UID: \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\") " pod="openstack/nova-api-0" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.903266 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cf59ae6-37a7-49a9-846d-e7815a57bda6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\") " pod="openstack/nova-api-0" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.916370 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5k46p" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.949777 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.951423 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.976713 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.986577 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.009377 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h84hj\" (UniqueName: \"kubernetes.io/projected/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-kube-api-access-h84hj\") pod \"nova-scheduler-0\" (UID: \"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959\") " pod="openstack/nova-scheduler-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.009449 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cf59ae6-37a7-49a9-846d-e7815a57bda6-logs\") pod \"nova-api-0\" (UID: \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\") " pod="openstack/nova-api-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.009471 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkp7s\" (UniqueName: \"kubernetes.io/projected/7cf59ae6-37a7-49a9-846d-e7815a57bda6-kube-api-access-nkp7s\") pod \"nova-api-0\" (UID: \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\") " pod="openstack/nova-api-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.009512 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cf59ae6-37a7-49a9-846d-e7815a57bda6-config-data\") pod \"nova-api-0\" (UID: \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\") " pod="openstack/nova-api-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.009538 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-config-data\") pod \"nova-scheduler-0\" (UID: \"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959\") " pod="openstack/nova-scheduler-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.009591 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cf59ae6-37a7-49a9-846d-e7815a57bda6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\") " pod="openstack/nova-api-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.009657 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959\") " pod="openstack/nova-scheduler-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.010422 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cf59ae6-37a7-49a9-846d-e7815a57bda6-logs\") pod \"nova-api-0\" (UID: \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\") " pod="openstack/nova-api-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.035384 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cf59ae6-37a7-49a9-846d-e7815a57bda6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\") " pod="openstack/nova-api-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.040535 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cf59ae6-37a7-49a9-846d-e7815a57bda6-config-data\") pod \"nova-api-0\" (UID: \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\") " pod="openstack/nova-api-0" Mar 09 13:42:51 crc 
kubenswrapper[4764]: I0309 13:42:51.058285 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkp7s\" (UniqueName: \"kubernetes.io/projected/7cf59ae6-37a7-49a9-846d-e7815a57bda6-kube-api-access-nkp7s\") pod \"nova-api-0\" (UID: \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\") " pod="openstack/nova-api-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.093673 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.099966 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.106315 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.111430 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h84hj\" (UniqueName: \"kubernetes.io/projected/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-kube-api-access-h84hj\") pod \"nova-scheduler-0\" (UID: \"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959\") " pod="openstack/nova-scheduler-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.111781 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-config-data\") pod \"nova-scheduler-0\" (UID: \"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959\") " pod="openstack/nova-scheduler-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.111939 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959\") " pod="openstack/nova-scheduler-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.117185 4764 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.126508 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-config-data\") pod \"nova-scheduler-0\" (UID: \"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959\") " pod="openstack/nova-scheduler-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.127444 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959\") " pod="openstack/nova-scheduler-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.171291 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h84hj\" (UniqueName: \"kubernetes.io/projected/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-kube-api-access-h84hj\") pod \"nova-scheduler-0\" (UID: \"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959\") " pod="openstack/nova-scheduler-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.185752 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.273809 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-config-data\") pod \"nova-metadata-0\" (UID: \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\") " pod="openstack/nova-metadata-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.273911 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-logs\") pod \"nova-metadata-0\" (UID: 
\"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\") " pod="openstack/nova-metadata-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.274027 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\") " pod="openstack/nova-metadata-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.274175 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnhnz\" (UniqueName: \"kubernetes.io/projected/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-kube-api-access-nnhnz\") pod \"nova-metadata-0\" (UID: \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\") " pod="openstack/nova-metadata-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.395386 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnhnz\" (UniqueName: \"kubernetes.io/projected/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-kube-api-access-nnhnz\") pod \"nova-metadata-0\" (UID: \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\") " pod="openstack/nova-metadata-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.395558 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-config-data\") pod \"nova-metadata-0\" (UID: \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\") " pod="openstack/nova-metadata-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.395611 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-logs\") pod \"nova-metadata-0\" (UID: \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\") " pod="openstack/nova-metadata-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.395679 
4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\") " pod="openstack/nova-metadata-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.404181 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-config-data\") pod \"nova-metadata-0\" (UID: \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\") " pod="openstack/nova-metadata-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.404777 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.407249 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-logs\") pod \"nova-metadata-0\" (UID: \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\") " pod="openstack/nova-metadata-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.408306 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.409715 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\") " pod="openstack/nova-metadata-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.422872 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.442200 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.443997 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnhnz\" (UniqueName: \"kubernetes.io/projected/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-kube-api-access-nnhnz\") pod \"nova-metadata-0\" (UID: \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\") " pod="openstack/nova-metadata-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.474001 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.506757 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.507374 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntmvb\" (UniqueName: \"kubernetes.io/projected/971aa9eb-f331-425d-bf49-d626f3552480-kube-api-access-ntmvb\") pod \"nova-cell1-novncproxy-0\" (UID: \"971aa9eb-f331-425d-bf49-d626f3552480\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.507425 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971aa9eb-f331-425d-bf49-d626f3552480-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"971aa9eb-f331-425d-bf49-d626f3552480\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.507604 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/971aa9eb-f331-425d-bf49-d626f3552480-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"971aa9eb-f331-425d-bf49-d626f3552480\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.534692 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-gsgp2"] Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.537758 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.556422 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-gsgp2"] Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.610869 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-ovsdbserver-nb\") pod \"dnsmasq-dns-8b8cf6657-gsgp2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.610972 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntmvb\" (UniqueName: \"kubernetes.io/projected/971aa9eb-f331-425d-bf49-d626f3552480-kube-api-access-ntmvb\") pod \"nova-cell1-novncproxy-0\" (UID: \"971aa9eb-f331-425d-bf49-d626f3552480\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.611005 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971aa9eb-f331-425d-bf49-d626f3552480-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"971aa9eb-f331-425d-bf49-d626f3552480\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.611031 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-config\") pod \"dnsmasq-dns-8b8cf6657-gsgp2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.611125 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xl8t\" (UniqueName: \"kubernetes.io/projected/dd806e2d-2675-448c-96f5-2440c2e243f2-kube-api-access-4xl8t\") pod \"dnsmasq-dns-8b8cf6657-gsgp2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.611168 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-dns-svc\") pod \"dnsmasq-dns-8b8cf6657-gsgp2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.611211 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/971aa9eb-f331-425d-bf49-d626f3552480-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"971aa9eb-f331-425d-bf49-d626f3552480\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.611279 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-ovsdbserver-sb\") pod \"dnsmasq-dns-8b8cf6657-gsgp2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.627174 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/971aa9eb-f331-425d-bf49-d626f3552480-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"971aa9eb-f331-425d-bf49-d626f3552480\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.627692 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971aa9eb-f331-425d-bf49-d626f3552480-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"971aa9eb-f331-425d-bf49-d626f3552480\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.635808 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntmvb\" (UniqueName: \"kubernetes.io/projected/971aa9eb-f331-425d-bf49-d626f3552480-kube-api-access-ntmvb\") pod \"nova-cell1-novncproxy-0\" (UID: \"971aa9eb-f331-425d-bf49-d626f3552480\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.713570 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-ovsdbserver-sb\") pod \"dnsmasq-dns-8b8cf6657-gsgp2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.713763 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-ovsdbserver-nb\") pod \"dnsmasq-dns-8b8cf6657-gsgp2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.713822 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-config\") pod \"dnsmasq-dns-8b8cf6657-gsgp2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.713916 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xl8t\" (UniqueName: \"kubernetes.io/projected/dd806e2d-2675-448c-96f5-2440c2e243f2-kube-api-access-4xl8t\") pod \"dnsmasq-dns-8b8cf6657-gsgp2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.713954 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-dns-svc\") pod \"dnsmasq-dns-8b8cf6657-gsgp2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.715228 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-dns-svc\") pod \"dnsmasq-dns-8b8cf6657-gsgp2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.715386 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-config\") pod \"dnsmasq-dns-8b8cf6657-gsgp2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.716142 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-ovsdbserver-nb\") pod \"dnsmasq-dns-8b8cf6657-gsgp2\" (UID: 
\"dd806e2d-2675-448c-96f5-2440c2e243f2\") " pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.717034 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-ovsdbserver-sb\") pod \"dnsmasq-dns-8b8cf6657-gsgp2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.755264 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xl8t\" (UniqueName: \"kubernetes.io/projected/dd806e2d-2675-448c-96f5-2440c2e243f2-kube-api-access-4xl8t\") pod \"dnsmasq-dns-8b8cf6657-gsgp2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.759625 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.868876 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.002155 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.058414 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-5k46p"] Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.189227 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:42:52 crc kubenswrapper[4764]: W0309 13:42:52.200512 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd0fc8c3_6a60_4629_8b8d_8dd8b471f959.slice/crio-6920f7803183ef0808ee3ca6aca18179666a5ad88d1d672e13be3328cbb65468 WatchSource:0}: Error finding container 6920f7803183ef0808ee3ca6aca18179666a5ad88d1d672e13be3328cbb65468: Status 404 returned error can't find the container with id 6920f7803183ef0808ee3ca6aca18179666a5ad88d1d672e13be3328cbb65468 Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.223783 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nlxjx"] Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.226853 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-nlxjx" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.237287 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.237343 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.259992 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nlxjx"] Mar 09 13:42:52 crc kubenswrapper[4764]: W0309 13:42:52.333081 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4a57a2f_3a75_4d6a_9fd1_046f26fb32d2.slice/crio-fd0fea3ed4c6e561e840066a42f896e2884b69902ca33275962f893fe59f7641 WatchSource:0}: Error finding container fd0fea3ed4c6e561e840066a42f896e2884b69902ca33275962f893fe59f7641: Status 404 returned error can't find the container with id fd0fea3ed4c6e561e840066a42f896e2884b69902ca33275962f893fe59f7641 Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.342425 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-config-data\") pod \"nova-cell1-conductor-db-sync-nlxjx\" (UID: \"b60c99da-3ae5-4340-bcb0-870731679c16\") " pod="openstack/nova-cell1-conductor-db-sync-nlxjx" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.342528 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-scripts\") pod \"nova-cell1-conductor-db-sync-nlxjx\" (UID: \"b60c99da-3ae5-4340-bcb0-870731679c16\") " pod="openstack/nova-cell1-conductor-db-sync-nlxjx" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 
13:42:52.342838 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-nlxjx\" (UID: \"b60c99da-3ae5-4340-bcb0-870731679c16\") " pod="openstack/nova-cell1-conductor-db-sync-nlxjx" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.342958 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlrlv\" (UniqueName: \"kubernetes.io/projected/b60c99da-3ae5-4340-bcb0-870731679c16-kube-api-access-xlrlv\") pod \"nova-cell1-conductor-db-sync-nlxjx\" (UID: \"b60c99da-3ae5-4340-bcb0-870731679c16\") " pod="openstack/nova-cell1-conductor-db-sync-nlxjx" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.345930 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.424971 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.445037 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-config-data\") pod \"nova-cell1-conductor-db-sync-nlxjx\" (UID: \"b60c99da-3ae5-4340-bcb0-870731679c16\") " pod="openstack/nova-cell1-conductor-db-sync-nlxjx" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.445093 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-scripts\") pod \"nova-cell1-conductor-db-sync-nlxjx\" (UID: \"b60c99da-3ae5-4340-bcb0-870731679c16\") " pod="openstack/nova-cell1-conductor-db-sync-nlxjx" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.445167 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-nlxjx\" (UID: \"b60c99da-3ae5-4340-bcb0-870731679c16\") " pod="openstack/nova-cell1-conductor-db-sync-nlxjx" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.445201 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlrlv\" (UniqueName: \"kubernetes.io/projected/b60c99da-3ae5-4340-bcb0-870731679c16-kube-api-access-xlrlv\") pod \"nova-cell1-conductor-db-sync-nlxjx\" (UID: \"b60c99da-3ae5-4340-bcb0-870731679c16\") " pod="openstack/nova-cell1-conductor-db-sync-nlxjx" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.452596 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-scripts\") pod \"nova-cell1-conductor-db-sync-nlxjx\" (UID: \"b60c99da-3ae5-4340-bcb0-870731679c16\") " pod="openstack/nova-cell1-conductor-db-sync-nlxjx" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.452800 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-nlxjx\" (UID: \"b60c99da-3ae5-4340-bcb0-870731679c16\") " pod="openstack/nova-cell1-conductor-db-sync-nlxjx" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.452853 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-config-data\") pod \"nova-cell1-conductor-db-sync-nlxjx\" (UID: \"b60c99da-3ae5-4340-bcb0-870731679c16\") " pod="openstack/nova-cell1-conductor-db-sync-nlxjx" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.460325 4764 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-xlrlv\" (UniqueName: \"kubernetes.io/projected/b60c99da-3ae5-4340-bcb0-870731679c16-kube-api-access-xlrlv\") pod \"nova-cell1-conductor-db-sync-nlxjx\" (UID: \"b60c99da-3ae5-4340-bcb0-870731679c16\") " pod="openstack/nova-cell1-conductor-db-sync-nlxjx" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.569172 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-nlxjx" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.594258 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-gsgp2"] Mar 09 13:42:52 crc kubenswrapper[4764]: W0309 13:42:52.602236 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd806e2d_2675_448c_96f5_2440c2e243f2.slice/crio-725b7755875ebd6a6866a71e5813a9d3f7def7b5fe8e24fd807f8fb4b2f4627a WatchSource:0}: Error finding container 725b7755875ebd6a6866a71e5813a9d3f7def7b5fe8e24fd807f8fb4b2f4627a: Status 404 returned error can't find the container with id 725b7755875ebd6a6866a71e5813a9d3f7def7b5fe8e24fd807f8fb4b2f4627a Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.603862 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"971aa9eb-f331-425d-bf49-d626f3552480","Type":"ContainerStarted","Data":"bd5e9cec9ddcd16266d8f6864eddb8fb5fac434d1ee835511cb22e846c128223"} Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.604890 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2","Type":"ContainerStarted","Data":"fd0fea3ed4c6e561e840066a42f896e2884b69902ca33275962f893fe59f7641"} Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.606011 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5k46p" 
event={"ID":"9f09c604-028e-4965-aef8-6005ae365be9","Type":"ContainerStarted","Data":"d7855c57065ea118e0a7a66a2f52d85558ba845a318ccc1f11c06fba1afae771"} Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.606039 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5k46p" event={"ID":"9f09c604-028e-4965-aef8-6005ae365be9","Type":"ContainerStarted","Data":"1b7c3f7b353ab36332a603793952cde39e961f03595f37d7c599b56a31606280"} Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.609634 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7cf59ae6-37a7-49a9-846d-e7815a57bda6","Type":"ContainerStarted","Data":"572746371ca77f76f3f1aebe0923635e0af71fe2c66089e7070ed987afb57c36"} Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.610591 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959","Type":"ContainerStarted","Data":"6920f7803183ef0808ee3ca6aca18179666a5ad88d1d672e13be3328cbb65468"} Mar 09 13:42:53 crc kubenswrapper[4764]: I0309 13:42:53.175202 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-5k46p" podStartSLOduration=3.175179388 podStartE2EDuration="3.175179388s" podCreationTimestamp="2026-03-09 13:42:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:42:52.628902863 +0000 UTC m=+1327.879074771" watchObservedRunningTime="2026-03-09 13:42:53.175179388 +0000 UTC m=+1328.425351296" Mar 09 13:42:53 crc kubenswrapper[4764]: I0309 13:42:53.176241 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nlxjx"] Mar 09 13:42:53 crc kubenswrapper[4764]: I0309 13:42:53.633105 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd806e2d-2675-448c-96f5-2440c2e243f2" 
containerID="a48379ed6aa8003713adc5f3726b706529f649e8177a5d07e91f56612288af49" exitCode=0 Mar 09 13:42:53 crc kubenswrapper[4764]: I0309 13:42:53.633307 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" event={"ID":"dd806e2d-2675-448c-96f5-2440c2e243f2","Type":"ContainerDied","Data":"a48379ed6aa8003713adc5f3726b706529f649e8177a5d07e91f56612288af49"} Mar 09 13:42:53 crc kubenswrapper[4764]: I0309 13:42:53.633679 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" event={"ID":"dd806e2d-2675-448c-96f5-2440c2e243f2","Type":"ContainerStarted","Data":"725b7755875ebd6a6866a71e5813a9d3f7def7b5fe8e24fd807f8fb4b2f4627a"} Mar 09 13:42:53 crc kubenswrapper[4764]: I0309 13:42:53.642187 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-nlxjx" event={"ID":"b60c99da-3ae5-4340-bcb0-870731679c16","Type":"ContainerStarted","Data":"83d6da43ad98ba05734f9db9d33ae5993158ca7ef62b6b91c4677e94546cef00"} Mar 09 13:42:53 crc kubenswrapper[4764]: I0309 13:42:53.642270 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-nlxjx" event={"ID":"b60c99da-3ae5-4340-bcb0-870731679c16","Type":"ContainerStarted","Data":"aaf11202ef0774e446d6c1ce96fa33d0ce40eb52f56f28a40e2337ab3da0d2c0"} Mar 09 13:42:53 crc kubenswrapper[4764]: I0309 13:42:53.729326 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-nlxjx" podStartSLOduration=1.729293192 podStartE2EDuration="1.729293192s" podCreationTimestamp="2026-03-09 13:42:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:42:53.698613663 +0000 UTC m=+1328.948785571" watchObservedRunningTime="2026-03-09 13:42:53.729293192 +0000 UTC m=+1328.979465100" Mar 09 13:42:54 crc kubenswrapper[4764]: I0309 13:42:54.854917 
4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:42:54 crc kubenswrapper[4764]: I0309 13:42:54.867118 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 13:42:56 crc kubenswrapper[4764]: I0309 13:42:56.690577 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959","Type":"ContainerStarted","Data":"9b4ad77f575f902209b59697f790a142bfdc593a1dbbfea9490ebe9d7bdb60c7"} Mar 09 13:42:56 crc kubenswrapper[4764]: I0309 13:42:56.693046 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"971aa9eb-f331-425d-bf49-d626f3552480","Type":"ContainerStarted","Data":"fb0bb019badceace7ef69226556c5b5a38430ea28c0d0fd7f30e7e3614d9a9fb"} Mar 09 13:42:56 crc kubenswrapper[4764]: I0309 13:42:56.693199 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="971aa9eb-f331-425d-bf49-d626f3552480" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://fb0bb019badceace7ef69226556c5b5a38430ea28c0d0fd7f30e7e3614d9a9fb" gracePeriod=30 Mar 09 13:42:56 crc kubenswrapper[4764]: I0309 13:42:56.696349 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2","Type":"ContainerStarted","Data":"81d1407cfee85c86ab81cdf9ec86e68979d06f5a586c4a54b9f7e6a537e80948"} Mar 09 13:42:56 crc kubenswrapper[4764]: I0309 13:42:56.699361 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" event={"ID":"dd806e2d-2675-448c-96f5-2440c2e243f2","Type":"ContainerStarted","Data":"d735f180db949f6ebbd416610517dc4e197003ab27c3b1031338a802031ce748"} Mar 09 13:42:56 crc kubenswrapper[4764]: I0309 13:42:56.699473 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:56 crc kubenswrapper[4764]: I0309 13:42:56.707841 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7cf59ae6-37a7-49a9-846d-e7815a57bda6","Type":"ContainerStarted","Data":"c32ce8455281ec944af9437b6a70a622f18333dc492467a4ab2c17a62209e49d"} Mar 09 13:42:56 crc kubenswrapper[4764]: I0309 13:42:56.718494 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.752068961 podStartE2EDuration="6.718465179s" podCreationTimestamp="2026-03-09 13:42:50 +0000 UTC" firstStartedPulling="2026-03-09 13:42:52.210895553 +0000 UTC m=+1327.461067461" lastFinishedPulling="2026-03-09 13:42:56.177291771 +0000 UTC m=+1331.427463679" observedRunningTime="2026-03-09 13:42:56.712161131 +0000 UTC m=+1331.962333059" watchObservedRunningTime="2026-03-09 13:42:56.718465179 +0000 UTC m=+1331.968637087" Mar 09 13:42:56 crc kubenswrapper[4764]: I0309 13:42:56.742691 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" podStartSLOduration=5.742669756 podStartE2EDuration="5.742669756s" podCreationTimestamp="2026-03-09 13:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:42:56.738839453 +0000 UTC m=+1331.989011361" watchObservedRunningTime="2026-03-09 13:42:56.742669756 +0000 UTC m=+1331.992841664" Mar 09 13:42:56 crc kubenswrapper[4764]: I0309 13:42:56.760440 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:42:56 crc kubenswrapper[4764]: I0309 13:42:56.770119 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.009900846 podStartE2EDuration="5.770097758s" podCreationTimestamp="2026-03-09 13:42:51 +0000 UTC" 
firstStartedPulling="2026-03-09 13:42:52.422365449 +0000 UTC m=+1327.672537357" lastFinishedPulling="2026-03-09 13:42:56.182562361 +0000 UTC m=+1331.432734269" observedRunningTime="2026-03-09 13:42:56.760331077 +0000 UTC m=+1332.010502995" watchObservedRunningTime="2026-03-09 13:42:56.770097758 +0000 UTC m=+1332.020269656" Mar 09 13:42:57 crc kubenswrapper[4764]: I0309 13:42:57.720415 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2","Type":"ContainerStarted","Data":"eb86255e700a4d6357cce0dc3140544f102a32bacc79bd2992bea6bc6385fe75"} Mar 09 13:42:57 crc kubenswrapper[4764]: I0309 13:42:57.720597 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2" containerName="nova-metadata-log" containerID="cri-o://81d1407cfee85c86ab81cdf9ec86e68979d06f5a586c4a54b9f7e6a537e80948" gracePeriod=30 Mar 09 13:42:57 crc kubenswrapper[4764]: I0309 13:42:57.720705 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2" containerName="nova-metadata-metadata" containerID="cri-o://eb86255e700a4d6357cce0dc3140544f102a32bacc79bd2992bea6bc6385fe75" gracePeriod=30 Mar 09 13:42:57 crc kubenswrapper[4764]: I0309 13:42:57.724256 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7cf59ae6-37a7-49a9-846d-e7815a57bda6","Type":"ContainerStarted","Data":"d605216502245286a1e6a03c9b52f7f75118161d38b5ee337edb5e76def5a35a"} Mar 09 13:42:57 crc kubenswrapper[4764]: I0309 13:42:57.766758 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.925768507 podStartE2EDuration="7.766738726s" podCreationTimestamp="2026-03-09 13:42:50 +0000 UTC" firstStartedPulling="2026-03-09 13:42:52.335708025 +0000 UTC 
m=+1327.585879933" lastFinishedPulling="2026-03-09 13:42:56.176678244 +0000 UTC m=+1331.426850152" observedRunningTime="2026-03-09 13:42:57.746682671 +0000 UTC m=+1332.996854579" watchObservedRunningTime="2026-03-09 13:42:57.766738726 +0000 UTC m=+1333.016910634" Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.736663 4764 generic.go:334] "Generic (PLEG): container finished" podID="a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2" containerID="eb86255e700a4d6357cce0dc3140544f102a32bacc79bd2992bea6bc6385fe75" exitCode=0 Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.738038 4764 generic.go:334] "Generic (PLEG): container finished" podID="a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2" containerID="81d1407cfee85c86ab81cdf9ec86e68979d06f5a586c4a54b9f7e6a537e80948" exitCode=143 Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.736842 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2","Type":"ContainerDied","Data":"eb86255e700a4d6357cce0dc3140544f102a32bacc79bd2992bea6bc6385fe75"} Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.738219 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2","Type":"ContainerDied","Data":"81d1407cfee85c86ab81cdf9ec86e68979d06f5a586c4a54b9f7e6a537e80948"} Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.738237 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2","Type":"ContainerDied","Data":"fd0fea3ed4c6e561e840066a42f896e2884b69902ca33275962f893fe59f7641"} Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.738250 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd0fea3ed4c6e561e840066a42f896e2884b69902ca33275962f893fe59f7641" Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.803613 4764 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.837971 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.676291276 podStartE2EDuration="8.837948977s" podCreationTimestamp="2026-03-09 13:42:50 +0000 UTC" firstStartedPulling="2026-03-09 13:42:52.013551814 +0000 UTC m=+1327.263723732" lastFinishedPulling="2026-03-09 13:42:56.175209525 +0000 UTC m=+1331.425381433" observedRunningTime="2026-03-09 13:42:57.779691352 +0000 UTC m=+1333.029863280" watchObservedRunningTime="2026-03-09 13:42:58.837948977 +0000 UTC m=+1334.088120875" Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.859520 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnhnz\" (UniqueName: \"kubernetes.io/projected/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-kube-api-access-nnhnz\") pod \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\" (UID: \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\") " Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.859624 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-logs\") pod \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\" (UID: \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\") " Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.859778 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-config-data\") pod \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\" (UID: \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\") " Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.859949 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-combined-ca-bundle\") pod 
\"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\" (UID: \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\") " Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.860934 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-logs" (OuterVolumeSpecName: "logs") pod "a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2" (UID: "a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.868781 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-kube-api-access-nnhnz" (OuterVolumeSpecName: "kube-api-access-nnhnz") pod "a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2" (UID: "a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2"). InnerVolumeSpecName "kube-api-access-nnhnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.899688 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2" (UID: "a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.902568 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-config-data" (OuterVolumeSpecName: "config-data") pod "a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2" (UID: "a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.962383 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.962436 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.962455 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnhnz\" (UniqueName: \"kubernetes.io/projected/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-kube-api-access-nnhnz\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.962469 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-logs\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.764679 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.814466 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.841066 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.853112 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:42:59 crc kubenswrapper[4764]: E0309 13:42:59.853924 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2" containerName="nova-metadata-metadata" Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.854051 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2" containerName="nova-metadata-metadata" Mar 09 13:42:59 crc kubenswrapper[4764]: E0309 13:42:59.854213 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2" containerName="nova-metadata-log" Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.854330 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2" containerName="nova-metadata-log" Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.854749 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2" containerName="nova-metadata-metadata" Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.854857 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2" containerName="nova-metadata-log" Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.856489 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.863394 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.865055 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.869290 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.994434 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " pod="openstack/nova-metadata-0" Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.994951 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24nvs\" (UniqueName: \"kubernetes.io/projected/cd9d1249-acf4-4cf5-a350-d4669d003a62-kube-api-access-24nvs\") pod \"nova-metadata-0\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " pod="openstack/nova-metadata-0" Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.994991 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd9d1249-acf4-4cf5-a350-d4669d003a62-logs\") pod \"nova-metadata-0\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " pod="openstack/nova-metadata-0" Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.995046 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " pod="openstack/nova-metadata-0" Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.995207 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-config-data\") pod \"nova-metadata-0\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " pod="openstack/nova-metadata-0" Mar 09 13:43:00 crc kubenswrapper[4764]: I0309 13:43:00.097580 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " pod="openstack/nova-metadata-0" Mar 09 13:43:00 crc kubenswrapper[4764]: I0309 13:43:00.097698 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24nvs\" (UniqueName: \"kubernetes.io/projected/cd9d1249-acf4-4cf5-a350-d4669d003a62-kube-api-access-24nvs\") pod \"nova-metadata-0\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " pod="openstack/nova-metadata-0" Mar 09 13:43:00 crc kubenswrapper[4764]: I0309 13:43:00.097743 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd9d1249-acf4-4cf5-a350-d4669d003a62-logs\") pod \"nova-metadata-0\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " pod="openstack/nova-metadata-0" Mar 09 13:43:00 crc kubenswrapper[4764]: I0309 13:43:00.097802 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " pod="openstack/nova-metadata-0" Mar 09 13:43:00 crc kubenswrapper[4764]: I0309 
13:43:00.097835 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-config-data\") pod \"nova-metadata-0\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " pod="openstack/nova-metadata-0" Mar 09 13:43:00 crc kubenswrapper[4764]: I0309 13:43:00.098576 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd9d1249-acf4-4cf5-a350-d4669d003a62-logs\") pod \"nova-metadata-0\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " pod="openstack/nova-metadata-0" Mar 09 13:43:00 crc kubenswrapper[4764]: I0309 13:43:00.102406 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " pod="openstack/nova-metadata-0" Mar 09 13:43:00 crc kubenswrapper[4764]: I0309 13:43:00.103061 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-config-data\") pod \"nova-metadata-0\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " pod="openstack/nova-metadata-0" Mar 09 13:43:00 crc kubenswrapper[4764]: I0309 13:43:00.103947 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " pod="openstack/nova-metadata-0" Mar 09 13:43:00 crc kubenswrapper[4764]: I0309 13:43:00.123149 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24nvs\" (UniqueName: \"kubernetes.io/projected/cd9d1249-acf4-4cf5-a350-d4669d003a62-kube-api-access-24nvs\") pod 
\"nova-metadata-0\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " pod="openstack/nova-metadata-0" Mar 09 13:43:00 crc kubenswrapper[4764]: I0309 13:43:00.182408 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 13:43:00 crc kubenswrapper[4764]: I0309 13:43:00.691329 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:43:00 crc kubenswrapper[4764]: W0309 13:43:00.701992 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd9d1249_acf4_4cf5_a350_d4669d003a62.slice/crio-ed67ba834fc7606ce45fe1f977a16e9be6d0e085710cf0629ecabbea579cc2e4 WatchSource:0}: Error finding container ed67ba834fc7606ce45fe1f977a16e9be6d0e085710cf0629ecabbea579cc2e4: Status 404 returned error can't find the container with id ed67ba834fc7606ce45fe1f977a16e9be6d0e085710cf0629ecabbea579cc2e4 Mar 09 13:43:00 crc kubenswrapper[4764]: I0309 13:43:00.775252 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cd9d1249-acf4-4cf5-a350-d4669d003a62","Type":"ContainerStarted","Data":"ed67ba834fc7606ce45fe1f977a16e9be6d0e085710cf0629ecabbea579cc2e4"} Mar 09 13:43:01 crc kubenswrapper[4764]: I0309 13:43:01.114085 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 13:43:01 crc kubenswrapper[4764]: I0309 13:43:01.114621 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 13:43:01 crc kubenswrapper[4764]: I0309 13:43:01.410070 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 09 13:43:01 crc kubenswrapper[4764]: I0309 13:43:01.412246 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 09 13:43:01 crc kubenswrapper[4764]: I0309 13:43:01.446187 
4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 09 13:43:01 crc kubenswrapper[4764]: I0309 13:43:01.580008 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2" path="/var/lib/kubelet/pods/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2/volumes" Mar 09 13:43:01 crc kubenswrapper[4764]: I0309 13:43:01.787745 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cd9d1249-acf4-4cf5-a350-d4669d003a62","Type":"ContainerStarted","Data":"f25bf6e9a5305bba01f06fcad8a45aed9a3a13bbe0f4d9055f3324ca8a56f788"} Mar 09 13:43:01 crc kubenswrapper[4764]: I0309 13:43:01.787834 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cd9d1249-acf4-4cf5-a350-d4669d003a62","Type":"ContainerStarted","Data":"b3430f3076fe3ab1c512978e7bbb98434aa249b306d65493cf9204eac67f6af3"} Mar 09 13:43:01 crc kubenswrapper[4764]: I0309 13:43:01.806806 4764 generic.go:334] "Generic (PLEG): container finished" podID="b60c99da-3ae5-4340-bcb0-870731679c16" containerID="83d6da43ad98ba05734f9db9d33ae5993158ca7ef62b6b91c4677e94546cef00" exitCode=0 Mar 09 13:43:01 crc kubenswrapper[4764]: I0309 13:43:01.808550 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-nlxjx" event={"ID":"b60c99da-3ae5-4340-bcb0-870731679c16","Type":"ContainerDied","Data":"83d6da43ad98ba05734f9db9d33ae5993158ca7ef62b6b91c4677e94546cef00"} Mar 09 13:43:01 crc kubenswrapper[4764]: I0309 13:43:01.820901 4764 generic.go:334] "Generic (PLEG): container finished" podID="9f09c604-028e-4965-aef8-6005ae365be9" containerID="d7855c57065ea118e0a7a66a2f52d85558ba845a318ccc1f11c06fba1afae771" exitCode=0 Mar 09 13:43:01 crc kubenswrapper[4764]: I0309 13:43:01.822057 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5k46p" 
event={"ID":"9f09c604-028e-4965-aef8-6005ae365be9","Type":"ContainerDied","Data":"d7855c57065ea118e0a7a66a2f52d85558ba845a318ccc1f11c06fba1afae771"} Mar 09 13:43:01 crc kubenswrapper[4764]: I0309 13:43:01.824026 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.82399874 podStartE2EDuration="2.82399874s" podCreationTimestamp="2026-03-09 13:42:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:43:01.817454516 +0000 UTC m=+1337.067626444" watchObservedRunningTime="2026-03-09 13:43:01.82399874 +0000 UTC m=+1337.074170648" Mar 09 13:43:01 crc kubenswrapper[4764]: I0309 13:43:01.875959 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:43:01 crc kubenswrapper[4764]: I0309 13:43:01.884991 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.001175 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-d2v25"] Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.001575 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58db5546cc-d2v25" podUID="4e886001-842a-4f97-b4c3-d088d80e6a45" containerName="dnsmasq-dns" containerID="cri-o://f733fa97c901fa8349b84558eaddb5374df152c045dfc4a2b694279f5d3fa434" gracePeriod=10 Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.202000 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7cf59ae6-37a7-49a9-846d-e7815a57bda6" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.177:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 
13:43:02.202292 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7cf59ae6-37a7-49a9-846d-e7815a57bda6" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.177:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.596895 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.664352 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-ovsdbserver-nb\") pod \"4e886001-842a-4f97-b4c3-d088d80e6a45\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.664447 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-dns-svc\") pod \"4e886001-842a-4f97-b4c3-d088d80e6a45\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.664525 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcpch\" (UniqueName: \"kubernetes.io/projected/4e886001-842a-4f97-b4c3-d088d80e6a45-kube-api-access-zcpch\") pod \"4e886001-842a-4f97-b4c3-d088d80e6a45\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.664614 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-config\") pod \"4e886001-842a-4f97-b4c3-d088d80e6a45\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.664853 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-ovsdbserver-sb\") pod \"4e886001-842a-4f97-b4c3-d088d80e6a45\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.678571 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e886001-842a-4f97-b4c3-d088d80e6a45-kube-api-access-zcpch" (OuterVolumeSpecName: "kube-api-access-zcpch") pod "4e886001-842a-4f97-b4c3-d088d80e6a45" (UID: "4e886001-842a-4f97-b4c3-d088d80e6a45"). InnerVolumeSpecName "kube-api-access-zcpch". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.731635 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4e886001-842a-4f97-b4c3-d088d80e6a45" (UID: "4e886001-842a-4f97-b4c3-d088d80e6a45"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.768994 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.769219 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcpch\" (UniqueName: \"kubernetes.io/projected/4e886001-842a-4f97-b4c3-d088d80e6a45-kube-api-access-zcpch\") on node \"crc\" DevicePath \"\""
Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.776888 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4e886001-842a-4f97-b4c3-d088d80e6a45" (UID: "4e886001-842a-4f97-b4c3-d088d80e6a45"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.776920 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4e886001-842a-4f97-b4c3-d088d80e6a45" (UID: "4e886001-842a-4f97-b4c3-d088d80e6a45"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.780265 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-config" (OuterVolumeSpecName: "config") pod "4e886001-842a-4f97-b4c3-d088d80e6a45" (UID: "4e886001-842a-4f97-b4c3-d088d80e6a45"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.834536 4764 generic.go:334] "Generic (PLEG): container finished" podID="4e886001-842a-4f97-b4c3-d088d80e6a45" containerID="f733fa97c901fa8349b84558eaddb5374df152c045dfc4a2b694279f5d3fa434" exitCode=0
Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.834633 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58db5546cc-d2v25"
Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.834779 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-d2v25" event={"ID":"4e886001-842a-4f97-b4c3-d088d80e6a45","Type":"ContainerDied","Data":"f733fa97c901fa8349b84558eaddb5374df152c045dfc4a2b694279f5d3fa434"}
Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.834890 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-d2v25" event={"ID":"4e886001-842a-4f97-b4c3-d088d80e6a45","Type":"ContainerDied","Data":"f8318b8e268cec9ccfcf591135ec8e9761aa9bf10f09e2ff5ebd0b76bbd7c843"}
Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.834936 4764 scope.go:117] "RemoveContainer" containerID="f733fa97c901fa8349b84558eaddb5374df152c045dfc4a2b694279f5d3fa434"
Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.871345 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.871387 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.871398 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.884453 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-d2v25"]
Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.894160 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-d2v25"]
Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.898009 4764 scope.go:117] "RemoveContainer" containerID="d406e10da6d16d2f9b6ec89cf139706a3d0d52a75646e6e8156525cfc445bd01"
Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.933978 4764 scope.go:117] "RemoveContainer" containerID="f733fa97c901fa8349b84558eaddb5374df152c045dfc4a2b694279f5d3fa434"
Mar 09 13:43:02 crc kubenswrapper[4764]: E0309 13:43:02.935326 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f733fa97c901fa8349b84558eaddb5374df152c045dfc4a2b694279f5d3fa434\": container with ID starting with f733fa97c901fa8349b84558eaddb5374df152c045dfc4a2b694279f5d3fa434 not found: ID does not exist" containerID="f733fa97c901fa8349b84558eaddb5374df152c045dfc4a2b694279f5d3fa434"
Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.935364 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f733fa97c901fa8349b84558eaddb5374df152c045dfc4a2b694279f5d3fa434"} err="failed to get container status \"f733fa97c901fa8349b84558eaddb5374df152c045dfc4a2b694279f5d3fa434\": rpc error: code = NotFound desc = could not find container \"f733fa97c901fa8349b84558eaddb5374df152c045dfc4a2b694279f5d3fa434\": container with ID starting with f733fa97c901fa8349b84558eaddb5374df152c045dfc4a2b694279f5d3fa434 not found: ID does not exist"
Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.935388 4764 scope.go:117] "RemoveContainer" containerID="d406e10da6d16d2f9b6ec89cf139706a3d0d52a75646e6e8156525cfc445bd01"
Mar 09 13:43:02 crc kubenswrapper[4764]: E0309 13:43:02.945323 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d406e10da6d16d2f9b6ec89cf139706a3d0d52a75646e6e8156525cfc445bd01\": container with ID starting with d406e10da6d16d2f9b6ec89cf139706a3d0d52a75646e6e8156525cfc445bd01 not found: ID does not exist" containerID="d406e10da6d16d2f9b6ec89cf139706a3d0d52a75646e6e8156525cfc445bd01"
Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.945363 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d406e10da6d16d2f9b6ec89cf139706a3d0d52a75646e6e8156525cfc445bd01"} err="failed to get container status \"d406e10da6d16d2f9b6ec89cf139706a3d0d52a75646e6e8156525cfc445bd01\": rpc error: code = NotFound desc = could not find container \"d406e10da6d16d2f9b6ec89cf139706a3d0d52a75646e6e8156525cfc445bd01\": container with ID starting with d406e10da6d16d2f9b6ec89cf139706a3d0d52a75646e6e8156525cfc445bd01 not found: ID does not exist"
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.372591 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5k46p"
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.380000 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-nlxjx"
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.493430 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-scripts\") pod \"b60c99da-3ae5-4340-bcb0-870731679c16\" (UID: \"b60c99da-3ae5-4340-bcb0-870731679c16\") "
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.493605 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-combined-ca-bundle\") pod \"b60c99da-3ae5-4340-bcb0-870731679c16\" (UID: \"b60c99da-3ae5-4340-bcb0-870731679c16\") "
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.494836 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6mc\" (UniqueName: \"kubernetes.io/projected/9f09c604-028e-4965-aef8-6005ae365be9-kube-api-access-sb6mc\") pod \"9f09c604-028e-4965-aef8-6005ae365be9\" (UID: \"9f09c604-028e-4965-aef8-6005ae365be9\") "
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.494952 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-scripts\") pod \"9f09c604-028e-4965-aef8-6005ae365be9\" (UID: \"9f09c604-028e-4965-aef8-6005ae365be9\") "
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.494987 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-config-data\") pod \"9f09c604-028e-4965-aef8-6005ae365be9\" (UID: \"9f09c604-028e-4965-aef8-6005ae365be9\") "
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.495017 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlrlv\" (UniqueName: \"kubernetes.io/projected/b60c99da-3ae5-4340-bcb0-870731679c16-kube-api-access-xlrlv\") pod \"b60c99da-3ae5-4340-bcb0-870731679c16\" (UID: \"b60c99da-3ae5-4340-bcb0-870731679c16\") "
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.495044 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-combined-ca-bundle\") pod \"9f09c604-028e-4965-aef8-6005ae365be9\" (UID: \"9f09c604-028e-4965-aef8-6005ae365be9\") "
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.495095 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-config-data\") pod \"b60c99da-3ae5-4340-bcb0-870731679c16\" (UID: \"b60c99da-3ae5-4340-bcb0-870731679c16\") "
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.499346 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-scripts" (OuterVolumeSpecName: "scripts") pod "b60c99da-3ae5-4340-bcb0-870731679c16" (UID: "b60c99da-3ae5-4340-bcb0-870731679c16"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.507451 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f09c604-028e-4965-aef8-6005ae365be9-kube-api-access-sb6mc" (OuterVolumeSpecName: "kube-api-access-sb6mc") pod "9f09c604-028e-4965-aef8-6005ae365be9" (UID: "9f09c604-028e-4965-aef8-6005ae365be9"). InnerVolumeSpecName "kube-api-access-sb6mc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.507725 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b60c99da-3ae5-4340-bcb0-870731679c16-kube-api-access-xlrlv" (OuterVolumeSpecName: "kube-api-access-xlrlv") pod "b60c99da-3ae5-4340-bcb0-870731679c16" (UID: "b60c99da-3ae5-4340-bcb0-870731679c16"). InnerVolumeSpecName "kube-api-access-xlrlv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.513777 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-scripts" (OuterVolumeSpecName: "scripts") pod "9f09c604-028e-4965-aef8-6005ae365be9" (UID: "9f09c604-028e-4965-aef8-6005ae365be9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.537254 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b60c99da-3ae5-4340-bcb0-870731679c16" (UID: "b60c99da-3ae5-4340-bcb0-870731679c16"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.539895 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-config-data" (OuterVolumeSpecName: "config-data") pod "9f09c604-028e-4965-aef8-6005ae365be9" (UID: "9f09c604-028e-4965-aef8-6005ae365be9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.548860 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f09c604-028e-4965-aef8-6005ae365be9" (UID: "9f09c604-028e-4965-aef8-6005ae365be9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.555859 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-config-data" (OuterVolumeSpecName: "config-data") pod "b60c99da-3ae5-4340-bcb0-870731679c16" (UID: "b60c99da-3ae5-4340-bcb0-870731679c16"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.571489 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e886001-842a-4f97-b4c3-d088d80e6a45" path="/var/lib/kubelet/pods/4e886001-842a-4f97-b4c3-d088d80e6a45/volumes"
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.598517 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6mc\" (UniqueName: \"kubernetes.io/projected/9f09c604-028e-4965-aef8-6005ae365be9-kube-api-access-sb6mc\") on node \"crc\" DevicePath \"\""
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.598554 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.598564 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.598601 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlrlv\" (UniqueName: \"kubernetes.io/projected/b60c99da-3ae5-4340-bcb0-870731679c16-kube-api-access-xlrlv\") on node \"crc\" DevicePath \"\""
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.598609 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.598618 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.598626 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.598638 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.845829 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-nlxjx" event={"ID":"b60c99da-3ae5-4340-bcb0-870731679c16","Type":"ContainerDied","Data":"aaf11202ef0774e446d6c1ce96fa33d0ce40eb52f56f28a40e2337ab3da0d2c0"}
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.845885 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aaf11202ef0774e446d6c1ce96fa33d0ce40eb52f56f28a40e2337ab3da0d2c0"
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.845970 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-nlxjx"
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.849032 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5k46p" event={"ID":"9f09c604-028e-4965-aef8-6005ae365be9","Type":"ContainerDied","Data":"1b7c3f7b353ab36332a603793952cde39e961f03595f37d7c599b56a31606280"}
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.849065 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b7c3f7b353ab36332a603793952cde39e961f03595f37d7c599b56a31606280"
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.849116 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5k46p"
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.863454 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.993489 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 09 13:43:03 crc kubenswrapper[4764]: E0309 13:43:03.994441 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f09c604-028e-4965-aef8-6005ae365be9" containerName="nova-manage"
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.994526 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f09c604-028e-4965-aef8-6005ae365be9" containerName="nova-manage"
Mar 09 13:43:03 crc kubenswrapper[4764]: E0309 13:43:03.994623 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e886001-842a-4f97-b4c3-d088d80e6a45" containerName="init"
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.994708 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e886001-842a-4f97-b4c3-d088d80e6a45" containerName="init"
Mar 09 13:43:03 crc kubenswrapper[4764]: E0309 13:43:03.994811 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e886001-842a-4f97-b4c3-d088d80e6a45" containerName="dnsmasq-dns"
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.994893 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e886001-842a-4f97-b4c3-d088d80e6a45" containerName="dnsmasq-dns"
Mar 09 13:43:03 crc kubenswrapper[4764]: E0309 13:43:03.994973 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b60c99da-3ae5-4340-bcb0-870731679c16" containerName="nova-cell1-conductor-db-sync"
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.995051 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b60c99da-3ae5-4340-bcb0-870731679c16" containerName="nova-cell1-conductor-db-sync"
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.995343 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e886001-842a-4f97-b4c3-d088d80e6a45" containerName="dnsmasq-dns"
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.995431 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f09c604-028e-4965-aef8-6005ae365be9" containerName="nova-manage"
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.995516 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b60c99da-3ae5-4340-bcb0-870731679c16" containerName="nova-cell1-conductor-db-sync"
Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.996674 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.001344 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.020998 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.109666 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbl7j\" (UniqueName: \"kubernetes.io/projected/959eb23f-c4b4-4f35-b284-38212848a084-kube-api-access-wbl7j\") pod \"nova-cell1-conductor-0\" (UID: \"959eb23f-c4b4-4f35-b284-38212848a084\") " pod="openstack/nova-cell1-conductor-0"
Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.109734 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/959eb23f-c4b4-4f35-b284-38212848a084-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"959eb23f-c4b4-4f35-b284-38212848a084\") " pod="openstack/nova-cell1-conductor-0"
Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.109819 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/959eb23f-c4b4-4f35-b284-38212848a084-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"959eb23f-c4b4-4f35-b284-38212848a084\") " pod="openstack/nova-cell1-conductor-0"
Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.194767 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.195156 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7cf59ae6-37a7-49a9-846d-e7815a57bda6" containerName="nova-api-log" containerID="cri-o://c32ce8455281ec944af9437b6a70a622f18333dc492467a4ab2c17a62209e49d" gracePeriod=30
Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.195405 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7cf59ae6-37a7-49a9-846d-e7815a57bda6" containerName="nova-api-api" containerID="cri-o://d605216502245286a1e6a03c9b52f7f75118161d38b5ee337edb5e76def5a35a" gracePeriod=30
Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.211975 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbl7j\" (UniqueName: \"kubernetes.io/projected/959eb23f-c4b4-4f35-b284-38212848a084-kube-api-access-wbl7j\") pod \"nova-cell1-conductor-0\" (UID: \"959eb23f-c4b4-4f35-b284-38212848a084\") " pod="openstack/nova-cell1-conductor-0"
Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.212058 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/959eb23f-c4b4-4f35-b284-38212848a084-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"959eb23f-c4b4-4f35-b284-38212848a084\") " pod="openstack/nova-cell1-conductor-0"
Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.212135 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/959eb23f-c4b4-4f35-b284-38212848a084-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"959eb23f-c4b4-4f35-b284-38212848a084\") " pod="openstack/nova-cell1-conductor-0"
Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.219553 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/959eb23f-c4b4-4f35-b284-38212848a084-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"959eb23f-c4b4-4f35-b284-38212848a084\") " pod="openstack/nova-cell1-conductor-0"
Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.220791 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.237357 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/959eb23f-c4b4-4f35-b284-38212848a084-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"959eb23f-c4b4-4f35-b284-38212848a084\") " pod="openstack/nova-cell1-conductor-0"
Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.237576 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.237958 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cd9d1249-acf4-4cf5-a350-d4669d003a62" containerName="nova-metadata-log" containerID="cri-o://b3430f3076fe3ab1c512978e7bbb98434aa249b306d65493cf9204eac67f6af3" gracePeriod=30
Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.238235 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cd9d1249-acf4-4cf5-a350-d4669d003a62" containerName="nova-metadata-metadata" containerID="cri-o://f25bf6e9a5305bba01f06fcad8a45aed9a3a13bbe0f4d9055f3324ca8a56f788" gracePeriod=30
Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.242868 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbl7j\" (UniqueName: \"kubernetes.io/projected/959eb23f-c4b4-4f35-b284-38212848a084-kube-api-access-wbl7j\") pod \"nova-cell1-conductor-0\" (UID: \"959eb23f-c4b4-4f35-b284-38212848a084\") " pod="openstack/nova-cell1-conductor-0"
Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.327332 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.361624 4764 scope.go:117] "RemoveContainer" containerID="b34325cd4e2f6811cadb8554b8a6fb24248546f3c4182376f60b7c3268c9f6c7"
Mar 09 13:43:04 crc kubenswrapper[4764]: E0309 13:43:04.641176 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd9d1249_acf4_4cf5_a350_d4669d003a62.slice/crio-conmon-f25bf6e9a5305bba01f06fcad8a45aed9a3a13bbe0f4d9055f3324ca8a56f788.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd9d1249_acf4_4cf5_a350_d4669d003a62.slice/crio-conmon-b3430f3076fe3ab1c512978e7bbb98434aa249b306d65493cf9204eac67f6af3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd9d1249_acf4_4cf5_a350_d4669d003a62.slice/crio-b3430f3076fe3ab1c512978e7bbb98434aa249b306d65493cf9204eac67f6af3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd9d1249_acf4_4cf5_a350_d4669d003a62.slice/crio-f25bf6e9a5305bba01f06fcad8a45aed9a3a13bbe0f4d9055f3324ca8a56f788.scope\": RecentStats: unable to find data in memory cache]"
Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.863061 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.869686 4764 generic.go:334] "Generic (PLEG): container finished" podID="7cf59ae6-37a7-49a9-846d-e7815a57bda6" containerID="c32ce8455281ec944af9437b6a70a622f18333dc492467a4ab2c17a62209e49d" exitCode=143
Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.869794 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7cf59ae6-37a7-49a9-846d-e7815a57bda6","Type":"ContainerDied","Data":"c32ce8455281ec944af9437b6a70a622f18333dc492467a4ab2c17a62209e49d"}
Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.872518 4764 generic.go:334] "Generic (PLEG): container finished" podID="cd9d1249-acf4-4cf5-a350-d4669d003a62" containerID="f25bf6e9a5305bba01f06fcad8a45aed9a3a13bbe0f4d9055f3324ca8a56f788" exitCode=0
Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.872617 4764 generic.go:334] "Generic (PLEG): container finished" podID="cd9d1249-acf4-4cf5-a350-d4669d003a62" containerID="b3430f3076fe3ab1c512978e7bbb98434aa249b306d65493cf9204eac67f6af3" exitCode=143
Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.872892 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="bd0fc8c3-6a60-4629-8b8d-8dd8b471f959" containerName="nova-scheduler-scheduler" containerID="cri-o://9b4ad77f575f902209b59697f790a142bfdc593a1dbbfea9490ebe9d7bdb60c7" gracePeriod=30
Mar 09 13:43:04 crc kubenswrapper[4764]: W0309 13:43:04.873795 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod959eb23f_c4b4_4f35_b284_38212848a084.slice/crio-7a1dac56a13025152565d05f1934efe1d8bdc00ab66969ded2d836713d40b181 WatchSource:0}: Error finding container 7a1dac56a13025152565d05f1934efe1d8bdc00ab66969ded2d836713d40b181: Status 404 returned error can't find the container with id 7a1dac56a13025152565d05f1934efe1d8bdc00ab66969ded2d836713d40b181
Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.872561 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cd9d1249-acf4-4cf5-a350-d4669d003a62","Type":"ContainerDied","Data":"f25bf6e9a5305bba01f06fcad8a45aed9a3a13bbe0f4d9055f3324ca8a56f788"}
Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.874081 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cd9d1249-acf4-4cf5-a350-d4669d003a62","Type":"ContainerDied","Data":"b3430f3076fe3ab1c512978e7bbb98434aa249b306d65493cf9204eac67f6af3"}
Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.893397 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.038843 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-nova-metadata-tls-certs\") pod \"cd9d1249-acf4-4cf5-a350-d4669d003a62\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") "
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.038911 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-combined-ca-bundle\") pod \"cd9d1249-acf4-4cf5-a350-d4669d003a62\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") "
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.039030 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd9d1249-acf4-4cf5-a350-d4669d003a62-logs\") pod \"cd9d1249-acf4-4cf5-a350-d4669d003a62\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") "
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.039235 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24nvs\" (UniqueName: \"kubernetes.io/projected/cd9d1249-acf4-4cf5-a350-d4669d003a62-kube-api-access-24nvs\") pod \"cd9d1249-acf4-4cf5-a350-d4669d003a62\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") "
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.039277 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-config-data\") pod \"cd9d1249-acf4-4cf5-a350-d4669d003a62\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") "
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.040038 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd9d1249-acf4-4cf5-a350-d4669d003a62-logs" (OuterVolumeSpecName: "logs") pod "cd9d1249-acf4-4cf5-a350-d4669d003a62" (UID: "cd9d1249-acf4-4cf5-a350-d4669d003a62"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.050182 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd9d1249-acf4-4cf5-a350-d4669d003a62-kube-api-access-24nvs" (OuterVolumeSpecName: "kube-api-access-24nvs") pod "cd9d1249-acf4-4cf5-a350-d4669d003a62" (UID: "cd9d1249-acf4-4cf5-a350-d4669d003a62"). InnerVolumeSpecName "kube-api-access-24nvs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.089779 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd9d1249-acf4-4cf5-a350-d4669d003a62" (UID: "cd9d1249-acf4-4cf5-a350-d4669d003a62"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.092890 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-config-data" (OuterVolumeSpecName: "config-data") pod "cd9d1249-acf4-4cf5-a350-d4669d003a62" (UID: "cd9d1249-acf4-4cf5-a350-d4669d003a62"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.114065 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "cd9d1249-acf4-4cf5-a350-d4669d003a62" (UID: "cd9d1249-acf4-4cf5-a350-d4669d003a62"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.142436 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24nvs\" (UniqueName: \"kubernetes.io/projected/cd9d1249-acf4-4cf5-a350-d4669d003a62-kube-api-access-24nvs\") on node \"crc\" DevicePath \"\""
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.142492 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.142504 4764 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.142513 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.142525 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd9d1249-acf4-4cf5-a350-d4669d003a62-logs\") on node \"crc\" DevicePath \"\""
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.885628 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"959eb23f-c4b4-4f35-b284-38212848a084","Type":"ContainerStarted","Data":"77d74bd2b6a4ce5fd56558be984d8ab6788ecfa94ffff85f54a47633040f07ad"}
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.886059 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.886073 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"959eb23f-c4b4-4f35-b284-38212848a084","Type":"ContainerStarted","Data":"7a1dac56a13025152565d05f1934efe1d8bdc00ab66969ded2d836713d40b181"}
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.887826 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cd9d1249-acf4-4cf5-a350-d4669d003a62","Type":"ContainerDied","Data":"ed67ba834fc7606ce45fe1f977a16e9be6d0e085710cf0629ecabbea579cc2e4"}
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.887917 4764 scope.go:117] "RemoveContainer" containerID="f25bf6e9a5305bba01f06fcad8a45aed9a3a13bbe0f4d9055f3324ca8a56f788"
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.888221 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.910524 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.9105020550000003 podStartE2EDuration="2.910502055s" podCreationTimestamp="2026-03-09 13:43:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:43:05.908506482 +0000 UTC m=+1341.158678410" watchObservedRunningTime="2026-03-09 13:43:05.910502055 +0000 UTC m=+1341.160673973"
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.920021 4764 scope.go:117] "RemoveContainer" containerID="b3430f3076fe3ab1c512978e7bbb98434aa249b306d65493cf9204eac67f6af3"
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.936747 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.952129 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.965305 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 09 13:43:05 crc kubenswrapper[4764]: E0309 13:43:05.965871 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd9d1249-acf4-4cf5-a350-d4669d003a62" containerName="nova-metadata-log"
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.965893 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd9d1249-acf4-4cf5-a350-d4669d003a62" containerName="nova-metadata-log"
Mar 09 13:43:05 crc kubenswrapper[4764]: E0309 13:43:05.965920 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd9d1249-acf4-4cf5-a350-d4669d003a62" containerName="nova-metadata-metadata"
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.965930 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd9d1249-acf4-4cf5-a350-d4669d003a62" containerName="nova-metadata-metadata"
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.966104 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd9d1249-acf4-4cf5-a350-d4669d003a62" containerName="nova-metadata-metadata"
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.966124 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd9d1249-acf4-4cf5-a350-d4669d003a62" containerName="nova-metadata-log"
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.967233 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.971466 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.971871 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.976328 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 09 13:43:06 crc kubenswrapper[4764]: I0309 13:43:06.059545 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aba6bca-21f2-4e18-90e7-098c8541a4f4-logs\") pod \"nova-metadata-0\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " pod="openstack/nova-metadata-0"
Mar 09 13:43:06 crc kubenswrapper[4764]: I0309 13:43:06.059685 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " pod="openstack/nova-metadata-0"
Mar 09 13:43:06 crc kubenswrapper[4764]: I0309 13:43:06.059730 4764
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " pod="openstack/nova-metadata-0" Mar 09 13:43:06 crc kubenswrapper[4764]: I0309 13:43:06.060090 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-config-data\") pod \"nova-metadata-0\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " pod="openstack/nova-metadata-0" Mar 09 13:43:06 crc kubenswrapper[4764]: I0309 13:43:06.060483 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6cx7\" (UniqueName: \"kubernetes.io/projected/8aba6bca-21f2-4e18-90e7-098c8541a4f4-kube-api-access-h6cx7\") pod \"nova-metadata-0\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " pod="openstack/nova-metadata-0" Mar 09 13:43:06 crc kubenswrapper[4764]: I0309 13:43:06.162376 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " pod="openstack/nova-metadata-0" Mar 09 13:43:06 crc kubenswrapper[4764]: I0309 13:43:06.162485 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-config-data\") pod \"nova-metadata-0\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " pod="openstack/nova-metadata-0" Mar 09 13:43:06 crc kubenswrapper[4764]: I0309 13:43:06.162554 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6cx7\" (UniqueName: 
\"kubernetes.io/projected/8aba6bca-21f2-4e18-90e7-098c8541a4f4-kube-api-access-h6cx7\") pod \"nova-metadata-0\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " pod="openstack/nova-metadata-0" Mar 09 13:43:06 crc kubenswrapper[4764]: I0309 13:43:06.162577 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aba6bca-21f2-4e18-90e7-098c8541a4f4-logs\") pod \"nova-metadata-0\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " pod="openstack/nova-metadata-0" Mar 09 13:43:06 crc kubenswrapper[4764]: I0309 13:43:06.162621 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " pod="openstack/nova-metadata-0" Mar 09 13:43:06 crc kubenswrapper[4764]: I0309 13:43:06.163023 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aba6bca-21f2-4e18-90e7-098c8541a4f4-logs\") pod \"nova-metadata-0\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " pod="openstack/nova-metadata-0" Mar 09 13:43:06 crc kubenswrapper[4764]: I0309 13:43:06.168385 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-config-data\") pod \"nova-metadata-0\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " pod="openstack/nova-metadata-0" Mar 09 13:43:06 crc kubenswrapper[4764]: I0309 13:43:06.171120 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " pod="openstack/nova-metadata-0" Mar 09 13:43:06 crc 
kubenswrapper[4764]: I0309 13:43:06.181113 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " pod="openstack/nova-metadata-0" Mar 09 13:43:06 crc kubenswrapper[4764]: I0309 13:43:06.189249 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6cx7\" (UniqueName: \"kubernetes.io/projected/8aba6bca-21f2-4e18-90e7-098c8541a4f4-kube-api-access-h6cx7\") pod \"nova-metadata-0\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " pod="openstack/nova-metadata-0" Mar 09 13:43:06 crc kubenswrapper[4764]: I0309 13:43:06.308763 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 13:43:06 crc kubenswrapper[4764]: E0309 13:43:06.448008 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9b4ad77f575f902209b59697f790a142bfdc593a1dbbfea9490ebe9d7bdb60c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 09 13:43:06 crc kubenswrapper[4764]: E0309 13:43:06.463768 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9b4ad77f575f902209b59697f790a142bfdc593a1dbbfea9490ebe9d7bdb60c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 09 13:43:06 crc kubenswrapper[4764]: E0309 13:43:06.495694 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="9b4ad77f575f902209b59697f790a142bfdc593a1dbbfea9490ebe9d7bdb60c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 09 13:43:06 crc kubenswrapper[4764]: E0309 13:43:06.495801 4764 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="bd0fc8c3-6a60-4629-8b8d-8dd8b471f959" containerName="nova-scheduler-scheduler" Mar 09 13:43:06 crc kubenswrapper[4764]: I0309 13:43:06.942588 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:43:07 crc kubenswrapper[4764]: I0309 13:43:07.573905 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd9d1249-acf4-4cf5-a350-d4669d003a62" path="/var/lib/kubelet/pods/cd9d1249-acf4-4cf5-a350-d4669d003a62/volumes" Mar 09 13:43:07 crc kubenswrapper[4764]: I0309 13:43:07.922731 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8aba6bca-21f2-4e18-90e7-098c8541a4f4","Type":"ContainerStarted","Data":"43bb793d9ae7aef4ac5c3135ad4636253715b28d8b5d66da3fd63d1b86a1f016"} Mar 09 13:43:07 crc kubenswrapper[4764]: I0309 13:43:07.922784 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8aba6bca-21f2-4e18-90e7-098c8541a4f4","Type":"ContainerStarted","Data":"ce8b8b59c7d046943c3a2a1f0ccc0e08b931a62a55c138ad63284ef6b7ad9a9a"} Mar 09 13:43:07 crc kubenswrapper[4764]: I0309 13:43:07.922797 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8aba6bca-21f2-4e18-90e7-098c8541a4f4","Type":"ContainerStarted","Data":"b10db7fc62dc747c8a3073bba39f8052766584cab6d39aec677be986aaca3d56"} Mar 09 13:43:08 crc kubenswrapper[4764]: I0309 13:43:08.942238 4764 generic.go:334] "Generic (PLEG): container finished" podID="7cf59ae6-37a7-49a9-846d-e7815a57bda6" 
containerID="d605216502245286a1e6a03c9b52f7f75118161d38b5ee337edb5e76def5a35a" exitCode=0 Mar 09 13:43:08 crc kubenswrapper[4764]: I0309 13:43:08.942330 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7cf59ae6-37a7-49a9-846d-e7815a57bda6","Type":"ContainerDied","Data":"d605216502245286a1e6a03c9b52f7f75118161d38b5ee337edb5e76def5a35a"} Mar 09 13:43:08 crc kubenswrapper[4764]: I0309 13:43:08.954084 4764 generic.go:334] "Generic (PLEG): container finished" podID="bd0fc8c3-6a60-4629-8b8d-8dd8b471f959" containerID="9b4ad77f575f902209b59697f790a142bfdc593a1dbbfea9490ebe9d7bdb60c7" exitCode=0 Mar 09 13:43:08 crc kubenswrapper[4764]: I0309 13:43:08.954215 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959","Type":"ContainerDied","Data":"9b4ad77f575f902209b59697f790a142bfdc593a1dbbfea9490ebe9d7bdb60c7"} Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.066519 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.079067 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.112857 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.112825983 podStartE2EDuration="4.112825983s" podCreationTimestamp="2026-03-09 13:43:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:43:07.952108694 +0000 UTC m=+1343.202280602" watchObservedRunningTime="2026-03-09 13:43:09.112825983 +0000 UTC m=+1344.362997901" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.133102 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-combined-ca-bundle\") pod \"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959\" (UID: \"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959\") " Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.133175 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-config-data\") pod \"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959\" (UID: \"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959\") " Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.133304 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h84hj\" (UniqueName: \"kubernetes.io/projected/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-kube-api-access-h84hj\") pod \"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959\" (UID: \"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959\") " Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.133332 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cf59ae6-37a7-49a9-846d-e7815a57bda6-logs\") pod \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\" (UID: 
\"7cf59ae6-37a7-49a9-846d-e7815a57bda6\") " Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.133366 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkp7s\" (UniqueName: \"kubernetes.io/projected/7cf59ae6-37a7-49a9-846d-e7815a57bda6-kube-api-access-nkp7s\") pod \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\" (UID: \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\") " Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.133392 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cf59ae6-37a7-49a9-846d-e7815a57bda6-combined-ca-bundle\") pod \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\" (UID: \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\") " Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.133422 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cf59ae6-37a7-49a9-846d-e7815a57bda6-config-data\") pod \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\" (UID: \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\") " Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.134418 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cf59ae6-37a7-49a9-846d-e7815a57bda6-logs" (OuterVolumeSpecName: "logs") pod "7cf59ae6-37a7-49a9-846d-e7815a57bda6" (UID: "7cf59ae6-37a7-49a9-846d-e7815a57bda6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.146261 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cf59ae6-37a7-49a9-846d-e7815a57bda6-kube-api-access-nkp7s" (OuterVolumeSpecName: "kube-api-access-nkp7s") pod "7cf59ae6-37a7-49a9-846d-e7815a57bda6" (UID: "7cf59ae6-37a7-49a9-846d-e7815a57bda6"). InnerVolumeSpecName "kube-api-access-nkp7s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.154119 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-kube-api-access-h84hj" (OuterVolumeSpecName: "kube-api-access-h84hj") pod "bd0fc8c3-6a60-4629-8b8d-8dd8b471f959" (UID: "bd0fc8c3-6a60-4629-8b8d-8dd8b471f959"). InnerVolumeSpecName "kube-api-access-h84hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.165184 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cf59ae6-37a7-49a9-846d-e7815a57bda6-config-data" (OuterVolumeSpecName: "config-data") pod "7cf59ae6-37a7-49a9-846d-e7815a57bda6" (UID: "7cf59ae6-37a7-49a9-846d-e7815a57bda6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.169004 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd0fc8c3-6a60-4629-8b8d-8dd8b471f959" (UID: "bd0fc8c3-6a60-4629-8b8d-8dd8b471f959"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.169129 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cf59ae6-37a7-49a9-846d-e7815a57bda6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7cf59ae6-37a7-49a9-846d-e7815a57bda6" (UID: "7cf59ae6-37a7-49a9-846d-e7815a57bda6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.181510 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-config-data" (OuterVolumeSpecName: "config-data") pod "bd0fc8c3-6a60-4629-8b8d-8dd8b471f959" (UID: "bd0fc8c3-6a60-4629-8b8d-8dd8b471f959"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.236010 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h84hj\" (UniqueName: \"kubernetes.io/projected/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-kube-api-access-h84hj\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.236056 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cf59ae6-37a7-49a9-846d-e7815a57bda6-logs\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.236067 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkp7s\" (UniqueName: \"kubernetes.io/projected/7cf59ae6-37a7-49a9-846d-e7815a57bda6-kube-api-access-nkp7s\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.236081 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cf59ae6-37a7-49a9-846d-e7815a57bda6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.236093 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cf59ae6-37a7-49a9-846d-e7815a57bda6-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.236106 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.236117 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.967437 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7cf59ae6-37a7-49a9-846d-e7815a57bda6","Type":"ContainerDied","Data":"572746371ca77f76f3f1aebe0923635e0af71fe2c66089e7070ed987afb57c36"} Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.968051 4764 scope.go:117] "RemoveContainer" containerID="d605216502245286a1e6a03c9b52f7f75118161d38b5ee337edb5e76def5a35a" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.967465 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.972136 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959","Type":"ContainerDied","Data":"6920f7803183ef0808ee3ca6aca18179666a5ad88d1d672e13be3328cbb65468"} Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.972310 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.001881 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.015130 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.016686 4764 scope.go:117] "RemoveContainer" containerID="c32ce8455281ec944af9437b6a70a622f18333dc492467a4ab2c17a62209e49d" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.029818 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.041913 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.051827 4764 scope.go:117] "RemoveContainer" containerID="9b4ad77f575f902209b59697f790a142bfdc593a1dbbfea9490ebe9d7bdb60c7" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.056148 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 09 13:43:10 crc kubenswrapper[4764]: E0309 13:43:10.056630 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf59ae6-37a7-49a9-846d-e7815a57bda6" containerName="nova-api-api" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.056661 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf59ae6-37a7-49a9-846d-e7815a57bda6" containerName="nova-api-api" Mar 09 13:43:10 crc kubenswrapper[4764]: E0309 13:43:10.056690 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf59ae6-37a7-49a9-846d-e7815a57bda6" containerName="nova-api-log" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.056697 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf59ae6-37a7-49a9-846d-e7815a57bda6" containerName="nova-api-log" Mar 09 13:43:10 crc kubenswrapper[4764]: E0309 13:43:10.056709 
4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd0fc8c3-6a60-4629-8b8d-8dd8b471f959" containerName="nova-scheduler-scheduler" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.056716 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd0fc8c3-6a60-4629-8b8d-8dd8b471f959" containerName="nova-scheduler-scheduler" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.056913 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf59ae6-37a7-49a9-846d-e7815a57bda6" containerName="nova-api-log" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.056935 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd0fc8c3-6a60-4629-8b8d-8dd8b471f959" containerName="nova-scheduler-scheduler" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.056948 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf59ae6-37a7-49a9-846d-e7815a57bda6" containerName="nova-api-api" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.058003 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.060788 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.064982 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.067463 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.072110 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.075984 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.084975 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.154913 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtx9b\" (UniqueName: \"kubernetes.io/projected/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-kube-api-access-qtx9b\") pod \"nova-scheduler-0\" (UID: \"e3d70a54-660f-4ef9-bd2a-ed16699d8d66\") " pod="openstack/nova-scheduler-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.154993 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-logs\") pod \"nova-api-0\" (UID: \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\") " pod="openstack/nova-api-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.155573 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\") " pod="openstack/nova-api-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.155663 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-config-data\") pod \"nova-scheduler-0\" (UID: \"e3d70a54-660f-4ef9-bd2a-ed16699d8d66\") " 
pod="openstack/nova-scheduler-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.155734 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-config-data\") pod \"nova-api-0\" (UID: \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\") " pod="openstack/nova-api-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.156033 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfl66\" (UniqueName: \"kubernetes.io/projected/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-kube-api-access-pfl66\") pod \"nova-api-0\" (UID: \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\") " pod="openstack/nova-api-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.156145 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e3d70a54-660f-4ef9-bd2a-ed16699d8d66\") " pod="openstack/nova-scheduler-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.258384 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtx9b\" (UniqueName: \"kubernetes.io/projected/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-kube-api-access-qtx9b\") pod \"nova-scheduler-0\" (UID: \"e3d70a54-660f-4ef9-bd2a-ed16699d8d66\") " pod="openstack/nova-scheduler-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.258470 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-logs\") pod \"nova-api-0\" (UID: \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\") " pod="openstack/nova-api-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.258508 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\") " pod="openstack/nova-api-0"
Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.258530 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-config-data\") pod \"nova-scheduler-0\" (UID: \"e3d70a54-660f-4ef9-bd2a-ed16699d8d66\") " pod="openstack/nova-scheduler-0"
Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.258556 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-config-data\") pod \"nova-api-0\" (UID: \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\") " pod="openstack/nova-api-0"
Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.259201 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-logs\") pod \"nova-api-0\" (UID: \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\") " pod="openstack/nova-api-0"
Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.259335 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfl66\" (UniqueName: \"kubernetes.io/projected/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-kube-api-access-pfl66\") pod \"nova-api-0\" (UID: \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\") " pod="openstack/nova-api-0"
Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.259412 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e3d70a54-660f-4ef9-bd2a-ed16699d8d66\") " pod="openstack/nova-scheduler-0"
Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.263943 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\") " pod="openstack/nova-api-0"
Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.264004 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-config-data\") pod \"nova-scheduler-0\" (UID: \"e3d70a54-660f-4ef9-bd2a-ed16699d8d66\") " pod="openstack/nova-scheduler-0"
Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.264567 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-config-data\") pod \"nova-api-0\" (UID: \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\") " pod="openstack/nova-api-0"
Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.265370 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e3d70a54-660f-4ef9-bd2a-ed16699d8d66\") " pod="openstack/nova-scheduler-0"
Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.278597 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtx9b\" (UniqueName: \"kubernetes.io/projected/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-kube-api-access-qtx9b\") pod \"nova-scheduler-0\" (UID: \"e3d70a54-660f-4ef9-bd2a-ed16699d8d66\") " pod="openstack/nova-scheduler-0"
Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.279710 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfl66\" (UniqueName: \"kubernetes.io/projected/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-kube-api-access-pfl66\") pod \"nova-api-0\" (UID: \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\") " pod="openstack/nova-api-0"
Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.380915 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.393009 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.901495 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.968774 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 09 13:43:10 crc kubenswrapper[4764]: W0309 13:43:10.969193 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15e658d7_b575_4e5a_a0f2_3d1adcc41cc0.slice/crio-6553270ae008524ac846200ba9108d557920af3b1024ead659efdad57728b5bc WatchSource:0}: Error finding container 6553270ae008524ac846200ba9108d557920af3b1024ead659efdad57728b5bc: Status 404 returned error can't find the container with id 6553270ae008524ac846200ba9108d557920af3b1024ead659efdad57728b5bc
Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.989336 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e3d70a54-660f-4ef9-bd2a-ed16699d8d66","Type":"ContainerStarted","Data":"737be67408dde868ad9928c7e4b5b5a92634607014e02a50994bdc6c48b356c6"}
Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.990610 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0","Type":"ContainerStarted","Data":"6553270ae008524ac846200ba9108d557920af3b1024ead659efdad57728b5bc"}
Mar 09 13:43:11 crc kubenswrapper[4764]: I0309 13:43:11.309540 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 09 13:43:11 crc kubenswrapper[4764]: I0309 13:43:11.309615 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 09 13:43:11 crc kubenswrapper[4764]: I0309 13:43:11.572793 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cf59ae6-37a7-49a9-846d-e7815a57bda6" path="/var/lib/kubelet/pods/7cf59ae6-37a7-49a9-846d-e7815a57bda6/volumes"
Mar 09 13:43:11 crc kubenswrapper[4764]: I0309 13:43:11.573406 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd0fc8c3-6a60-4629-8b8d-8dd8b471f959" path="/var/lib/kubelet/pods/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959/volumes"
Mar 09 13:43:12 crc kubenswrapper[4764]: I0309 13:43:12.002527 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0","Type":"ContainerStarted","Data":"e8c68c03da7f4747913f3409f5ace642ba6a37cb32da7a694e3af908791475c8"}
Mar 09 13:43:12 crc kubenswrapper[4764]: I0309 13:43:12.003087 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0","Type":"ContainerStarted","Data":"62f7feefecbbe169538a57f544a5a783ae3f5d36a7e66415eb161a48e16ae91a"}
Mar 09 13:43:12 crc kubenswrapper[4764]: I0309 13:43:12.004826 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e3d70a54-660f-4ef9-bd2a-ed16699d8d66","Type":"ContainerStarted","Data":"dc09c0ba380d56f5c82ec457ccdcea9f947bd8b08307a77c8684608027a7c1ac"}
Mar 09 13:43:12 crc kubenswrapper[4764]: I0309 13:43:12.044540 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.044503735 podStartE2EDuration="2.044503735s" podCreationTimestamp="2026-03-09 13:43:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:43:12.024426059 +0000 UTC m=+1347.274597967" watchObservedRunningTime="2026-03-09 13:43:12.044503735 +0000 UTC m=+1347.294675663"
Mar 09 13:43:12 crc kubenswrapper[4764]: I0309 13:43:12.050091 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.050070493 podStartE2EDuration="2.050070493s" podCreationTimestamp="2026-03-09 13:43:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:43:12.045958844 +0000 UTC m=+1347.296130752" watchObservedRunningTime="2026-03-09 13:43:12.050070493 +0000 UTC m=+1347.300242401"
Mar 09 13:43:14 crc kubenswrapper[4764]: I0309 13:43:14.365136 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Mar 09 13:43:15 crc kubenswrapper[4764]: I0309 13:43:15.393782 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 09 13:43:16 crc kubenswrapper[4764]: I0309 13:43:16.309417 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 09 13:43:16 crc kubenswrapper[4764]: I0309 13:43:16.309809 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 09 13:43:17 crc kubenswrapper[4764]: I0309 13:43:17.323860 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8aba6bca-21f2-4e18-90e7-098c8541a4f4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.185:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 09 13:43:17 crc kubenswrapper[4764]: I0309 13:43:17.323860 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8aba6bca-21f2-4e18-90e7-098c8541a4f4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.185:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 09 13:43:20 crc kubenswrapper[4764]: I0309 13:43:20.381732 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 09 13:43:20 crc kubenswrapper[4764]: I0309 13:43:20.382228 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 09 13:43:20 crc kubenswrapper[4764]: I0309 13:43:20.394083 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 09 13:43:20 crc kubenswrapper[4764]: I0309 13:43:20.437876 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 09 13:43:21 crc kubenswrapper[4764]: I0309 13:43:21.132009 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 09 13:43:21 crc kubenswrapper[4764]: I0309 13:43:21.464999 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.186:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 13:43:21 crc kubenswrapper[4764]: I0309 13:43:21.465062 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.186:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 13:43:26 crc kubenswrapper[4764]: I0309 13:43:26.316748 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 09 13:43:26 crc kubenswrapper[4764]: I0309 13:43:26.320802 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 09 13:43:26 crc kubenswrapper[4764]: I0309 13:43:26.326656 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 09 13:43:26 crc kubenswrapper[4764]: I0309 13:43:26.327346 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.091292 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.128433 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/971aa9eb-f331-425d-bf49-d626f3552480-combined-ca-bundle\") pod \"971aa9eb-f331-425d-bf49-d626f3552480\" (UID: \"971aa9eb-f331-425d-bf49-d626f3552480\") "
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.128504 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971aa9eb-f331-425d-bf49-d626f3552480-config-data\") pod \"971aa9eb-f331-425d-bf49-d626f3552480\" (UID: \"971aa9eb-f331-425d-bf49-d626f3552480\") "
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.128716 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntmvb\" (UniqueName: \"kubernetes.io/projected/971aa9eb-f331-425d-bf49-d626f3552480-kube-api-access-ntmvb\") pod \"971aa9eb-f331-425d-bf49-d626f3552480\" (UID: \"971aa9eb-f331-425d-bf49-d626f3552480\") "
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.136087 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/971aa9eb-f331-425d-bf49-d626f3552480-kube-api-access-ntmvb" (OuterVolumeSpecName: "kube-api-access-ntmvb") pod "971aa9eb-f331-425d-bf49-d626f3552480" (UID: "971aa9eb-f331-425d-bf49-d626f3552480"). InnerVolumeSpecName "kube-api-access-ntmvb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.156635 4764 generic.go:334] "Generic (PLEG): container finished" podID="971aa9eb-f331-425d-bf49-d626f3552480" containerID="fb0bb019badceace7ef69226556c5b5a38430ea28c0d0fd7f30e7e3614d9a9fb" exitCode=137
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.157964 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.158141 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"971aa9eb-f331-425d-bf49-d626f3552480","Type":"ContainerDied","Data":"fb0bb019badceace7ef69226556c5b5a38430ea28c0d0fd7f30e7e3614d9a9fb"}
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.158185 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"971aa9eb-f331-425d-bf49-d626f3552480","Type":"ContainerDied","Data":"bd5e9cec9ddcd16266d8f6864eddb8fb5fac434d1ee835511cb22e846c128223"}
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.158209 4764 scope.go:117] "RemoveContainer" containerID="fb0bb019badceace7ef69226556c5b5a38430ea28c0d0fd7f30e7e3614d9a9fb"
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.162050 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/971aa9eb-f331-425d-bf49-d626f3552480-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "971aa9eb-f331-425d-bf49-d626f3552480" (UID: "971aa9eb-f331-425d-bf49-d626f3552480"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.162879 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/971aa9eb-f331-425d-bf49-d626f3552480-config-data" (OuterVolumeSpecName: "config-data") pod "971aa9eb-f331-425d-bf49-d626f3552480" (UID: "971aa9eb-f331-425d-bf49-d626f3552480"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.230169 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntmvb\" (UniqueName: \"kubernetes.io/projected/971aa9eb-f331-425d-bf49-d626f3552480-kube-api-access-ntmvb\") on node \"crc\" DevicePath \"\""
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.230213 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/971aa9eb-f331-425d-bf49-d626f3552480-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.230228 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971aa9eb-f331-425d-bf49-d626f3552480-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.248129 4764 scope.go:117] "RemoveContainer" containerID="fb0bb019badceace7ef69226556c5b5a38430ea28c0d0fd7f30e7e3614d9a9fb"
Mar 09 13:43:27 crc kubenswrapper[4764]: E0309 13:43:27.248809 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb0bb019badceace7ef69226556c5b5a38430ea28c0d0fd7f30e7e3614d9a9fb\": container with ID starting with fb0bb019badceace7ef69226556c5b5a38430ea28c0d0fd7f30e7e3614d9a9fb not found: ID does not exist" containerID="fb0bb019badceace7ef69226556c5b5a38430ea28c0d0fd7f30e7e3614d9a9fb"
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.248872 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb0bb019badceace7ef69226556c5b5a38430ea28c0d0fd7f30e7e3614d9a9fb"} err="failed to get container status \"fb0bb019badceace7ef69226556c5b5a38430ea28c0d0fd7f30e7e3614d9a9fb\": rpc error: code = NotFound desc = could not find container \"fb0bb019badceace7ef69226556c5b5a38430ea28c0d0fd7f30e7e3614d9a9fb\": container with ID starting with fb0bb019badceace7ef69226556c5b5a38430ea28c0d0fd7f30e7e3614d9a9fb not found: ID does not exist"
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.497078 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.521800 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.537052 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 09 13:43:27 crc kubenswrapper[4764]: E0309 13:43:27.537717 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="971aa9eb-f331-425d-bf49-d626f3552480" containerName="nova-cell1-novncproxy-novncproxy"
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.537746 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="971aa9eb-f331-425d-bf49-d626f3552480" containerName="nova-cell1-novncproxy-novncproxy"
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.537995 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="971aa9eb-f331-425d-bf49-d626f3552480" containerName="nova-cell1-novncproxy-novncproxy"
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.539029 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.543403 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.543677 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.543804 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.547475 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.576844 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="971aa9eb-f331-425d-bf49-d626f3552480" path="/var/lib/kubelet/pods/971aa9eb-f331-425d-bf49-d626f3552480/volumes"
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.640270 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6932dd15-578a-4965-bcb9-b506d4e3cd2f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932dd15-578a-4965-bcb9-b506d4e3cd2f\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.640323 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jknct\" (UniqueName: \"kubernetes.io/projected/6932dd15-578a-4965-bcb9-b506d4e3cd2f-kube-api-access-jknct\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932dd15-578a-4965-bcb9-b506d4e3cd2f\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.640357 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6932dd15-578a-4965-bcb9-b506d4e3cd2f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932dd15-578a-4965-bcb9-b506d4e3cd2f\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.640506 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6932dd15-578a-4965-bcb9-b506d4e3cd2f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932dd15-578a-4965-bcb9-b506d4e3cd2f\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.640560 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6932dd15-578a-4965-bcb9-b506d4e3cd2f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932dd15-578a-4965-bcb9-b506d4e3cd2f\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.741832 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6932dd15-578a-4965-bcb9-b506d4e3cd2f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932dd15-578a-4965-bcb9-b506d4e3cd2f\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.741900 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6932dd15-578a-4965-bcb9-b506d4e3cd2f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932dd15-578a-4965-bcb9-b506d4e3cd2f\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.741929 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6932dd15-578a-4965-bcb9-b506d4e3cd2f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932dd15-578a-4965-bcb9-b506d4e3cd2f\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.741954 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jknct\" (UniqueName: \"kubernetes.io/projected/6932dd15-578a-4965-bcb9-b506d4e3cd2f-kube-api-access-jknct\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932dd15-578a-4965-bcb9-b506d4e3cd2f\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.741974 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6932dd15-578a-4965-bcb9-b506d4e3cd2f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932dd15-578a-4965-bcb9-b506d4e3cd2f\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.747046 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6932dd15-578a-4965-bcb9-b506d4e3cd2f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932dd15-578a-4965-bcb9-b506d4e3cd2f\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.747432 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6932dd15-578a-4965-bcb9-b506d4e3cd2f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932dd15-578a-4965-bcb9-b506d4e3cd2f\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.747767 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6932dd15-578a-4965-bcb9-b506d4e3cd2f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932dd15-578a-4965-bcb9-b506d4e3cd2f\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.748274 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6932dd15-578a-4965-bcb9-b506d4e3cd2f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932dd15-578a-4965-bcb9-b506d4e3cd2f\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.764292 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jknct\" (UniqueName: \"kubernetes.io/projected/6932dd15-578a-4965-bcb9-b506d4e3cd2f-kube-api-access-jknct\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932dd15-578a-4965-bcb9-b506d4e3cd2f\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.859405 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 09 13:43:28 crc kubenswrapper[4764]: I0309 13:43:28.155416 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 09 13:43:28 crc kubenswrapper[4764]: I0309 13:43:28.370456 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 13:43:28 crc kubenswrapper[4764]: I0309 13:43:28.370577 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 13:43:29 crc kubenswrapper[4764]: I0309 13:43:29.186837 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6932dd15-578a-4965-bcb9-b506d4e3cd2f","Type":"ContainerStarted","Data":"0acc8aaa267e07479eac94d51afefd6535810f193ba4dfbfd67a21ef8822e104"}
Mar 09 13:43:29 crc kubenswrapper[4764]: I0309 13:43:29.187485 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6932dd15-578a-4965-bcb9-b506d4e3cd2f","Type":"ContainerStarted","Data":"3d495470829f66481099a9f4e12b11b8d834944a8778a8e0089111f310a90d03"}
Mar 09 13:43:29 crc kubenswrapper[4764]: I0309 13:43:29.222014 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.221980193 podStartE2EDuration="2.221980193s" podCreationTimestamp="2026-03-09 13:43:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:43:29.21213971 +0000 UTC m=+1364.462311648" watchObservedRunningTime="2026-03-09 13:43:29.221980193 +0000 UTC m=+1364.472152111"
Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.389730 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.390311 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.392099 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.392149 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.399195 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.403245 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.643041 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-bwpkr"]
Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.644554 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr"
Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.711308 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-bwpkr"]
Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.730062 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-dns-svc\") pod \"dnsmasq-dns-68d4b6d797-bwpkr\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr"
Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.730150 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-config\") pod \"dnsmasq-dns-68d4b6d797-bwpkr\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr"
Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.730184 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-ovsdbserver-sb\") pod \"dnsmasq-dns-68d4b6d797-bwpkr\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr"
Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.730312 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-ovsdbserver-nb\") pod \"dnsmasq-dns-68d4b6d797-bwpkr\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr"
Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.730354 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jzfc\" (UniqueName: \"kubernetes.io/projected/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-kube-api-access-8jzfc\") pod \"dnsmasq-dns-68d4b6d797-bwpkr\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr"
Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.835447 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-ovsdbserver-nb\") pod \"dnsmasq-dns-68d4b6d797-bwpkr\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr"
Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.835538 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jzfc\" (UniqueName: \"kubernetes.io/projected/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-kube-api-access-8jzfc\") pod \"dnsmasq-dns-68d4b6d797-bwpkr\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr"
Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.836981 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-ovsdbserver-nb\") pod \"dnsmasq-dns-68d4b6d797-bwpkr\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr"
Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.837548 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-dns-svc\") pod \"dnsmasq-dns-68d4b6d797-bwpkr\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr"
Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.835657 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-dns-svc\") pod \"dnsmasq-dns-68d4b6d797-bwpkr\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr"
Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.841047 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-config\") pod \"dnsmasq-dns-68d4b6d797-bwpkr\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr"
Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.841100 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-ovsdbserver-sb\") pod \"dnsmasq-dns-68d4b6d797-bwpkr\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr"
Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.841940 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-ovsdbserver-sb\") pod \"dnsmasq-dns-68d4b6d797-bwpkr\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr"
Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.842521 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-config\") pod \"dnsmasq-dns-68d4b6d797-bwpkr\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr"
Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.871350 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jzfc\" (UniqueName: \"kubernetes.io/projected/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-kube-api-access-8jzfc\") pod \"dnsmasq-dns-68d4b6d797-bwpkr\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr"
Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.972052 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr"
Mar 09 13:43:31 crc kubenswrapper[4764]: I0309 13:43:31.575925 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-bwpkr"]
Mar 09 13:43:32 crc kubenswrapper[4764]: I0309 13:43:32.237290 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" event={"ID":"4ae25624-74af-4de4-8aa1-14ea5dbc7b68","Type":"ContainerStarted","Data":"3ac89812f31874fa083542c67d0593b58a24f3774ac9cde06937d2e9c1a94aaf"}
Mar 09 13:43:32 crc kubenswrapper[4764]: I0309 13:43:32.859919 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 09 13:43:33 crc kubenswrapper[4764]: I0309 13:43:33.247924 4764 generic.go:334] "Generic (PLEG): container finished" podID="4ae25624-74af-4de4-8aa1-14ea5dbc7b68" containerID="1417bd2805b59981136a1861cd39d3dca99a51d7e7919e8b5c01839da7f91123" exitCode=0
Mar 09 13:43:33 crc kubenswrapper[4764]: I0309 13:43:33.247972 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" event={"ID":"4ae25624-74af-4de4-8aa1-14ea5dbc7b68","Type":"ContainerDied","Data":"1417bd2805b59981136a1861cd39d3dca99a51d7e7919e8b5c01839da7f91123"}
Mar 09 13:43:33 crc kubenswrapper[4764]: I0309 13:43:33.519801 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 13:43:33 crc kubenswrapper[4764]: I0309 13:43:33.520721 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="ceilometer-central-agent" containerID="cri-o://83e456ab335c632ce0189226c9d2e7f0e9ecfe5a406e7edda24caa25bafb3539" gracePeriod=30
Mar 09 13:43:33 crc kubenswrapper[4764]: I0309 13:43:33.520804 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="ceilometer-notification-agent" containerID="cri-o://14dd461ed43101072708519038bb785a1ba517ec81427d8fb63817a3eca45624" gracePeriod=30
Mar 09 13:43:33 crc kubenswrapper[4764]: I0309 13:43:33.520804 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="sg-core" containerID="cri-o://b944ec589887c7cb6534662085f3e5e58d2f1dce03f0aa12b1ed3e01a649c84e" gracePeriod=30
Mar 09 13:43:33 crc kubenswrapper[4764]: I0309 13:43:33.520835 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="proxy-httpd" containerID="cri-o://6967373dce85bc726e8932709b98fc503ab51918ee1216795cbe6f5192aa2e00" gracePeriod=30
Mar 09 13:43:33 crc kubenswrapper[4764]: I0309 13:43:33.647424 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 09 13:43:33 crc kubenswrapper[4764]: I0309 13:43:33.647782 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" containerName="nova-api-log" containerID="cri-o://62f7feefecbbe169538a57f544a5a783ae3f5d36a7e66415eb161a48e16ae91a" gracePeriod=30
Mar 09 13:43:33 crc kubenswrapper[4764]: I0309 13:43:33.647991 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" containerName="nova-api-api" containerID="cri-o://e8c68c03da7f4747913f3409f5ace642ba6a37cb32da7a694e3af908791475c8" gracePeriod=30
Mar 09 13:43:33 crc kubenswrapper[4764]: I0309 13:43:33.845156 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.174:3000/\": dial tcp 10.217.0.174:3000: connect: connection refused"
Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.260561 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" event={"ID":"4ae25624-74af-4de4-8aa1-14ea5dbc7b68","Type":"ContainerStarted","Data":"37fa28134b3929cb4f988c4f648598adc00dd97c039b66d6a61153de431009b4"}
Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.262186 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr"
Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.265571 4764 generic.go:334] "Generic (PLEG): container finished" podID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerID="6967373dce85bc726e8932709b98fc503ab51918ee1216795cbe6f5192aa2e00" exitCode=0
Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.265595 4764 generic.go:334] "Generic (PLEG): container finished" podID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerID="b944ec589887c7cb6534662085f3e5e58d2f1dce03f0aa12b1ed3e01a649c84e" exitCode=2
Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.265604 4764 generic.go:334] "Generic (PLEG): container finished" podID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerID="14dd461ed43101072708519038bb785a1ba517ec81427d8fb63817a3eca45624" exitCode=0
Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.265613 4764 generic.go:334] "Generic (PLEG): container finished" podID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerID="83e456ab335c632ce0189226c9d2e7f0e9ecfe5a406e7edda24caa25bafb3539" exitCode=0
Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.265666 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0"
event={"ID":"85e5a081-f0bc-457a-a3f0-9f9152f942c3","Type":"ContainerDied","Data":"6967373dce85bc726e8932709b98fc503ab51918ee1216795cbe6f5192aa2e00"} Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.265685 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85e5a081-f0bc-457a-a3f0-9f9152f942c3","Type":"ContainerDied","Data":"b944ec589887c7cb6534662085f3e5e58d2f1dce03f0aa12b1ed3e01a649c84e"} Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.265697 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85e5a081-f0bc-457a-a3f0-9f9152f942c3","Type":"ContainerDied","Data":"14dd461ed43101072708519038bb785a1ba517ec81427d8fb63817a3eca45624"} Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.265707 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85e5a081-f0bc-457a-a3f0-9f9152f942c3","Type":"ContainerDied","Data":"83e456ab335c632ce0189226c9d2e7f0e9ecfe5a406e7edda24caa25bafb3539"} Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.268112 4764 generic.go:334] "Generic (PLEG): container finished" podID="15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" containerID="62f7feefecbbe169538a57f544a5a783ae3f5d36a7e66415eb161a48e16ae91a" exitCode=143 Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.268172 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0","Type":"ContainerDied","Data":"62f7feefecbbe169538a57f544a5a783ae3f5d36a7e66415eb161a48e16ae91a"} Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.285754 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" podStartSLOduration=4.2857334080000005 podStartE2EDuration="4.285733408s" podCreationTimestamp="2026-03-09 13:43:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-09 13:43:34.284388193 +0000 UTC m=+1369.534560101" watchObservedRunningTime="2026-03-09 13:43:34.285733408 +0000 UTC m=+1369.535905316" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.451837 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.539486 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-config-data\") pod \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.539675 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-scripts\") pod \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.539733 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-ceilometer-tls-certs\") pod \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.539769 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9xwx\" (UniqueName: \"kubernetes.io/projected/85e5a081-f0bc-457a-a3f0-9f9152f942c3-kube-api-access-k9xwx\") pod \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.539806 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/85e5a081-f0bc-457a-a3f0-9f9152f942c3-run-httpd\") pod \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.539850 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-combined-ca-bundle\") pod \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.539876 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-sg-core-conf-yaml\") pod \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.539938 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85e5a081-f0bc-457a-a3f0-9f9152f942c3-log-httpd\") pod \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.540224 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85e5a081-f0bc-457a-a3f0-9f9152f942c3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "85e5a081-f0bc-457a-a3f0-9f9152f942c3" (UID: "85e5a081-f0bc-457a-a3f0-9f9152f942c3"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.540705 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85e5a081-f0bc-457a-a3f0-9f9152f942c3-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.541016 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85e5a081-f0bc-457a-a3f0-9f9152f942c3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "85e5a081-f0bc-457a-a3f0-9f9152f942c3" (UID: "85e5a081-f0bc-457a-a3f0-9f9152f942c3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.562161 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85e5a081-f0bc-457a-a3f0-9f9152f942c3-kube-api-access-k9xwx" (OuterVolumeSpecName: "kube-api-access-k9xwx") pod "85e5a081-f0bc-457a-a3f0-9f9152f942c3" (UID: "85e5a081-f0bc-457a-a3f0-9f9152f942c3"). InnerVolumeSpecName "kube-api-access-k9xwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.570190 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-scripts" (OuterVolumeSpecName: "scripts") pod "85e5a081-f0bc-457a-a3f0-9f9152f942c3" (UID: "85e5a081-f0bc-457a-a3f0-9f9152f942c3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.599680 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "85e5a081-f0bc-457a-a3f0-9f9152f942c3" (UID: "85e5a081-f0bc-457a-a3f0-9f9152f942c3"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.625919 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "85e5a081-f0bc-457a-a3f0-9f9152f942c3" (UID: "85e5a081-f0bc-457a-a3f0-9f9152f942c3"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.643739 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.643779 4764 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.643794 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9xwx\" (UniqueName: \"kubernetes.io/projected/85e5a081-f0bc-457a-a3f0-9f9152f942c3-kube-api-access-k9xwx\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.643804 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.643820 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85e5a081-f0bc-457a-a3f0-9f9152f942c3-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.660392 4764 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85e5a081-f0bc-457a-a3f0-9f9152f942c3" (UID: "85e5a081-f0bc-457a-a3f0-9f9152f942c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.670223 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-config-data" (OuterVolumeSpecName: "config-data") pod "85e5a081-f0bc-457a-a3f0-9f9152f942c3" (UID: "85e5a081-f0bc-457a-a3f0-9f9152f942c3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.745366 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.745404 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.281991 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85e5a081-f0bc-457a-a3f0-9f9152f942c3","Type":"ContainerDied","Data":"d2d07148bb1bfe75a3a625df2d14aba687a70d2b7d04873d08b4da416d307e9c"} Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.282097 4764 scope.go:117] "RemoveContainer" containerID="6967373dce85bc726e8932709b98fc503ab51918ee1216795cbe6f5192aa2e00" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.282155 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.345507 4764 scope.go:117] "RemoveContainer" containerID="b944ec589887c7cb6534662085f3e5e58d2f1dce03f0aa12b1ed3e01a649c84e" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.347622 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.355970 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.376437 4764 scope.go:117] "RemoveContainer" containerID="14dd461ed43101072708519038bb785a1ba517ec81427d8fb63817a3eca45624" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.379722 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:43:35 crc kubenswrapper[4764]: E0309 13:43:35.380181 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="ceilometer-notification-agent" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.380198 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="ceilometer-notification-agent" Mar 09 13:43:35 crc kubenswrapper[4764]: E0309 13:43:35.380222 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="ceilometer-central-agent" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.380229 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="ceilometer-central-agent" Mar 09 13:43:35 crc kubenswrapper[4764]: E0309 13:43:35.380259 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="proxy-httpd" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.380267 4764 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="proxy-httpd" Mar 09 13:43:35 crc kubenswrapper[4764]: E0309 13:43:35.380279 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="sg-core" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.380288 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="sg-core" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.380461 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="proxy-httpd" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.380480 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="ceilometer-notification-agent" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.380501 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="sg-core" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.380509 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="ceilometer-central-agent" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.383808 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.391476 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.392138 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.392676 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.398057 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.417508 4764 scope.go:117] "RemoveContainer" containerID="83e456ab335c632ce0189226c9d2e7f0e9ecfe5a406e7edda24caa25bafb3539" Mar 09 13:43:35 crc kubenswrapper[4764]: E0309 13:43:35.426128 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85e5a081_f0bc_457a_a3f0_9f9152f942c3.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85e5a081_f0bc_457a_a3f0_9f9152f942c3.slice/crio-d2d07148bb1bfe75a3a625df2d14aba687a70d2b7d04873d08b4da416d307e9c\": RecentStats: unable to find data in memory cache]" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.459195 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.459653 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-dkrrk\" (UniqueName: \"kubernetes.io/projected/2897dc65-e596-414b-b73e-172b0042b6cd-kube-api-access-dkrrk\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.459686 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-config-data\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.459758 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-scripts\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.459837 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.459873 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2897dc65-e596-414b-b73e-172b0042b6cd-run-httpd\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.459890 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.459953 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2897dc65-e596-414b-b73e-172b0042b6cd-log-httpd\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.561480 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.561531 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2897dc65-e596-414b-b73e-172b0042b6cd-run-httpd\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.561579 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2897dc65-e596-414b-b73e-172b0042b6cd-log-httpd\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.561683 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.561738 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dkrrk\" (UniqueName: \"kubernetes.io/projected/2897dc65-e596-414b-b73e-172b0042b6cd-kube-api-access-dkrrk\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.561769 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-config-data\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.561842 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-scripts\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.561898 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.563398 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2897dc65-e596-414b-b73e-172b0042b6cd-run-httpd\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.563389 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2897dc65-e596-414b-b73e-172b0042b6cd-log-httpd\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " 
pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.566084 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.567401 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-config-data\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.569259 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-scripts\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.577562 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" path="/var/lib/kubelet/pods/85e5a081-f0bc-457a-a3f0-9f9152f942c3/volumes" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.583958 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.584409 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " 
pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.602401 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkrrk\" (UniqueName: \"kubernetes.io/projected/2897dc65-e596-414b-b73e-172b0042b6cd-kube-api-access-dkrrk\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.714474 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.817588 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:43:36 crc kubenswrapper[4764]: I0309 13:43:36.230348 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:43:36 crc kubenswrapper[4764]: W0309 13:43:36.234867 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2897dc65_e596_414b_b73e_172b0042b6cd.slice/crio-515481ed93c967d04868f361f4da1be6b976ca58590cd914664e6651afcae34c WatchSource:0}: Error finding container 515481ed93c967d04868f361f4da1be6b976ca58590cd914664e6651afcae34c: Status 404 returned error can't find the container with id 515481ed93c967d04868f361f4da1be6b976ca58590cd914664e6651afcae34c Mar 09 13:43:36 crc kubenswrapper[4764]: I0309 13:43:36.299155 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2897dc65-e596-414b-b73e-172b0042b6cd","Type":"ContainerStarted","Data":"515481ed93c967d04868f361f4da1be6b976ca58590cd914664e6651afcae34c"} Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.219389 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.307221 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-combined-ca-bundle\") pod \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\" (UID: \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\") " Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.307305 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfl66\" (UniqueName: \"kubernetes.io/projected/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-kube-api-access-pfl66\") pod \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\" (UID: \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\") " Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.307338 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-logs\") pod \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\" (UID: \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\") " Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.308448 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-logs" (OuterVolumeSpecName: "logs") pod "15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" (UID: "15e658d7-b575-4e5a-a0f2-3d1adcc41cc0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.312101 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-config-data\") pod \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\" (UID: \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\") " Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.314078 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-logs\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.332981 4764 generic.go:334] "Generic (PLEG): container finished" podID="15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" containerID="e8c68c03da7f4747913f3409f5ace642ba6a37cb32da7a694e3af908791475c8" exitCode=0 Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.333059 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0","Type":"ContainerDied","Data":"e8c68c03da7f4747913f3409f5ace642ba6a37cb32da7a694e3af908791475c8"} Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.333093 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0","Type":"ContainerDied","Data":"6553270ae008524ac846200ba9108d557920af3b1024ead659efdad57728b5bc"} Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.333116 4764 scope.go:117] "RemoveContainer" containerID="e8c68c03da7f4747913f3409f5ace642ba6a37cb32da7a694e3af908791475c8" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.333281 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.341264 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-kube-api-access-pfl66" (OuterVolumeSpecName: "kube-api-access-pfl66") pod "15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" (UID: "15e658d7-b575-4e5a-a0f2-3d1adcc41cc0"). InnerVolumeSpecName "kube-api-access-pfl66". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.350298 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2897dc65-e596-414b-b73e-172b0042b6cd","Type":"ContainerStarted","Data":"6aba9d2ac154d93bccd8c03a964b0ce43be5d47c4542ac63eafc38ff3e190148"} Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.359108 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" (UID: "15e658d7-b575-4e5a-a0f2-3d1adcc41cc0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.383093 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-config-data" (OuterVolumeSpecName: "config-data") pod "15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" (UID: "15e658d7-b575-4e5a-a0f2-3d1adcc41cc0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.419108 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.419161 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfl66\" (UniqueName: \"kubernetes.io/projected/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-kube-api-access-pfl66\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.419174 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.455566 4764 scope.go:117] "RemoveContainer" containerID="62f7feefecbbe169538a57f544a5a783ae3f5d36a7e66415eb161a48e16ae91a" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.480146 4764 scope.go:117] "RemoveContainer" containerID="e8c68c03da7f4747913f3409f5ace642ba6a37cb32da7a694e3af908791475c8" Mar 09 13:43:37 crc kubenswrapper[4764]: E0309 13:43:37.488179 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8c68c03da7f4747913f3409f5ace642ba6a37cb32da7a694e3af908791475c8\": container with ID starting with e8c68c03da7f4747913f3409f5ace642ba6a37cb32da7a694e3af908791475c8 not found: ID does not exist" containerID="e8c68c03da7f4747913f3409f5ace642ba6a37cb32da7a694e3af908791475c8" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.488273 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8c68c03da7f4747913f3409f5ace642ba6a37cb32da7a694e3af908791475c8"} err="failed to get container status 
\"e8c68c03da7f4747913f3409f5ace642ba6a37cb32da7a694e3af908791475c8\": rpc error: code = NotFound desc = could not find container \"e8c68c03da7f4747913f3409f5ace642ba6a37cb32da7a694e3af908791475c8\": container with ID starting with e8c68c03da7f4747913f3409f5ace642ba6a37cb32da7a694e3af908791475c8 not found: ID does not exist" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.488315 4764 scope.go:117] "RemoveContainer" containerID="62f7feefecbbe169538a57f544a5a783ae3f5d36a7e66415eb161a48e16ae91a" Mar 09 13:43:37 crc kubenswrapper[4764]: E0309 13:43:37.489150 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62f7feefecbbe169538a57f544a5a783ae3f5d36a7e66415eb161a48e16ae91a\": container with ID starting with 62f7feefecbbe169538a57f544a5a783ae3f5d36a7e66415eb161a48e16ae91a not found: ID does not exist" containerID="62f7feefecbbe169538a57f544a5a783ae3f5d36a7e66415eb161a48e16ae91a" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.489175 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62f7feefecbbe169538a57f544a5a783ae3f5d36a7e66415eb161a48e16ae91a"} err="failed to get container status \"62f7feefecbbe169538a57f544a5a783ae3f5d36a7e66415eb161a48e16ae91a\": rpc error: code = NotFound desc = could not find container \"62f7feefecbbe169538a57f544a5a783ae3f5d36a7e66415eb161a48e16ae91a\": container with ID starting with 62f7feefecbbe169538a57f544a5a783ae3f5d36a7e66415eb161a48e16ae91a not found: ID does not exist" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.666566 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.683874 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.690334 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] 
Mar 09 13:43:37 crc kubenswrapper[4764]: E0309 13:43:37.691158 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" containerName="nova-api-api" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.691178 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" containerName="nova-api-api" Mar 09 13:43:37 crc kubenswrapper[4764]: E0309 13:43:37.691210 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" containerName="nova-api-log" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.691217 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" containerName="nova-api-log" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.691393 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" containerName="nova-api-log" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.691410 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" containerName="nova-api-api" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.692361 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.697266 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.697542 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.697736 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.706361 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.833522 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wznv\" (UniqueName: \"kubernetes.io/projected/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-kube-api-access-7wznv\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.833637 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.833719 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-logs\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.833848 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-config-data\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.833881 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.833955 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-public-tls-certs\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.860311 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.882450 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.936501 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wznv\" (UniqueName: \"kubernetes.io/projected/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-kube-api-access-7wznv\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.936626 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " 
pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.936722 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-logs\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.936929 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-config-data\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.936962 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.937065 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-public-tls-certs\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.937353 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-logs\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.942067 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-public-tls-certs\") pod 
\"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.942264 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.943520 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.948126 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-config-data\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.959125 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wznv\" (UniqueName: \"kubernetes.io/projected/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-kube-api-access-7wznv\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.043862 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.374073 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2897dc65-e596-414b-b73e-172b0042b6cd","Type":"ContainerStarted","Data":"b1056172a7b512fb661d17bf402edffaf8b9a666bfde122f2c185a7c23090848"} Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.405514 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.619939 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-q8h4m"] Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.621425 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-q8h4m" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.626195 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.632200 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.636351 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-q8h4m"] Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.730085 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.776445 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-config-data\") pod \"nova-cell1-cell-mapping-q8h4m\" (UID: \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\") " pod="openstack/nova-cell1-cell-mapping-q8h4m" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.776546 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-q8h4m\" (UID: \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\") " pod="openstack/nova-cell1-cell-mapping-q8h4m" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.776587 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-scripts\") pod \"nova-cell1-cell-mapping-q8h4m\" (UID: \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\") " pod="openstack/nova-cell1-cell-mapping-q8h4m" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.776659 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrhwm\" (UniqueName: \"kubernetes.io/projected/d98526d5-8eaa-44a7-a25d-662a4fc8758b-kube-api-access-vrhwm\") pod \"nova-cell1-cell-mapping-q8h4m\" (UID: \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\") " pod="openstack/nova-cell1-cell-mapping-q8h4m" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.878791 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-config-data\") pod \"nova-cell1-cell-mapping-q8h4m\" (UID: \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\") " pod="openstack/nova-cell1-cell-mapping-q8h4m" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.878879 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-q8h4m\" (UID: \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\") " pod="openstack/nova-cell1-cell-mapping-q8h4m" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 
13:43:38.878916 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-scripts\") pod \"nova-cell1-cell-mapping-q8h4m\" (UID: \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\") " pod="openstack/nova-cell1-cell-mapping-q8h4m" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.878987 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrhwm\" (UniqueName: \"kubernetes.io/projected/d98526d5-8eaa-44a7-a25d-662a4fc8758b-kube-api-access-vrhwm\") pod \"nova-cell1-cell-mapping-q8h4m\" (UID: \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\") " pod="openstack/nova-cell1-cell-mapping-q8h4m" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.885891 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-config-data\") pod \"nova-cell1-cell-mapping-q8h4m\" (UID: \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\") " pod="openstack/nova-cell1-cell-mapping-q8h4m" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.886257 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-scripts\") pod \"nova-cell1-cell-mapping-q8h4m\" (UID: \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\") " pod="openstack/nova-cell1-cell-mapping-q8h4m" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.886900 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-q8h4m\" (UID: \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\") " pod="openstack/nova-cell1-cell-mapping-q8h4m" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.900202 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vrhwm\" (UniqueName: \"kubernetes.io/projected/d98526d5-8eaa-44a7-a25d-662a4fc8758b-kube-api-access-vrhwm\") pod \"nova-cell1-cell-mapping-q8h4m\" (UID: \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\") " pod="openstack/nova-cell1-cell-mapping-q8h4m" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.957128 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-q8h4m" Mar 09 13:43:39 crc kubenswrapper[4764]: I0309 13:43:39.439727 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2897dc65-e596-414b-b73e-172b0042b6cd","Type":"ContainerStarted","Data":"550d0daaa1bf282375c8ca306a313e035fcd02974a76611ae4861ffff324f8a9"} Mar 09 13:43:39 crc kubenswrapper[4764]: I0309 13:43:39.441742 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf561cd8-b441-4efe-8f37-9c925d1f7aa9","Type":"ContainerStarted","Data":"8800be21f6e5e98b3173a95394f9eca30f4eb9aa26729be4bf34f42eed9c86fc"} Mar 09 13:43:39 crc kubenswrapper[4764]: I0309 13:43:39.441843 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf561cd8-b441-4efe-8f37-9c925d1f7aa9","Type":"ContainerStarted","Data":"8364f048c74ee582bf3a1b604c7622e3c5a53940323ec1e7e02d222016811233"} Mar 09 13:43:39 crc kubenswrapper[4764]: I0309 13:43:39.441862 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf561cd8-b441-4efe-8f37-9c925d1f7aa9","Type":"ContainerStarted","Data":"fa48d3e5b6386cde541f520160503cb71e6ebccb4522e2395f1e89b01ebb551e"} Mar 09 13:43:39 crc kubenswrapper[4764]: I0309 13:43:39.471311 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.4712800169999998 podStartE2EDuration="2.471280017s" podCreationTimestamp="2026-03-09 13:43:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:43:39.463771086 +0000 UTC m=+1374.713943014" watchObservedRunningTime="2026-03-09 13:43:39.471280017 +0000 UTC m=+1374.721451925" Mar 09 13:43:39 crc kubenswrapper[4764]: I0309 13:43:39.502522 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-q8h4m"] Mar 09 13:43:39 crc kubenswrapper[4764]: I0309 13:43:39.573899 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" path="/var/lib/kubelet/pods/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0/volumes" Mar 09 13:43:40 crc kubenswrapper[4764]: I0309 13:43:40.454627 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-q8h4m" event={"ID":"d98526d5-8eaa-44a7-a25d-662a4fc8758b","Type":"ContainerStarted","Data":"858a961159a6d1cec6dd22f3429517714ebd1b6a16fd98b28235cb60fc1a94ee"} Mar 09 13:43:40 crc kubenswrapper[4764]: I0309 13:43:40.455154 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-q8h4m" event={"ID":"d98526d5-8eaa-44a7-a25d-662a4fc8758b","Type":"ContainerStarted","Data":"0ad74f9523811644a338c930f4e1d9354e736d92d5564d6b6688e12a4bedac66"} Mar 09 13:43:40 crc kubenswrapper[4764]: I0309 13:43:40.479197 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-q8h4m" podStartSLOduration=2.4791733049999998 podStartE2EDuration="2.479173305s" podCreationTimestamp="2026-03-09 13:43:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:43:40.477564343 +0000 UTC m=+1375.727736251" watchObservedRunningTime="2026-03-09 13:43:40.479173305 +0000 UTC m=+1375.729345213" Mar 09 13:43:40 crc kubenswrapper[4764]: I0309 13:43:40.974090 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.060484 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-gsgp2"] Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.062002 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" podUID="dd806e2d-2675-448c-96f5-2440c2e243f2" containerName="dnsmasq-dns" containerID="cri-o://d735f180db949f6ebbd416610517dc4e197003ab27c3b1031338a802031ce748" gracePeriod=10 Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.469223 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd806e2d-2675-448c-96f5-2440c2e243f2" containerID="d735f180db949f6ebbd416610517dc4e197003ab27c3b1031338a802031ce748" exitCode=0 Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.469303 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" event={"ID":"dd806e2d-2675-448c-96f5-2440c2e243f2","Type":"ContainerDied","Data":"d735f180db949f6ebbd416610517dc4e197003ab27c3b1031338a802031ce748"} Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.473939 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" containerName="ceilometer-central-agent" containerID="cri-o://6aba9d2ac154d93bccd8c03a964b0ce43be5d47c4542ac63eafc38ff3e190148" gracePeriod=30 Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.474114 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2897dc65-e596-414b-b73e-172b0042b6cd","Type":"ContainerStarted","Data":"b7616cfa2d6717183764d5b731f3e4c011f99816e53a80c47d3844cc65c6452d"} Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.474174 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 
13:43:41.474661 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" containerName="proxy-httpd" containerID="cri-o://b7616cfa2d6717183764d5b731f3e4c011f99816e53a80c47d3844cc65c6452d" gracePeriod=30 Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.474715 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" containerName="sg-core" containerID="cri-o://550d0daaa1bf282375c8ca306a313e035fcd02974a76611ae4861ffff324f8a9" gracePeriod=30 Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.474776 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" containerName="ceilometer-notification-agent" containerID="cri-o://b1056172a7b512fb661d17bf402edffaf8b9a666bfde122f2c185a7c23090848" gracePeriod=30 Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.519611 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.019439885 podStartE2EDuration="6.519579023s" podCreationTimestamp="2026-03-09 13:43:35 +0000 UTC" firstStartedPulling="2026-03-09 13:43:36.237448537 +0000 UTC m=+1371.487620445" lastFinishedPulling="2026-03-09 13:43:40.737587665 +0000 UTC m=+1375.987759583" observedRunningTime="2026-03-09 13:43:41.5067273 +0000 UTC m=+1376.756899228" watchObservedRunningTime="2026-03-09 13:43:41.519579023 +0000 UTC m=+1376.769750931" Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.571507 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.575965 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-config\") pod \"dd806e2d-2675-448c-96f5-2440c2e243f2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.576433 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-ovsdbserver-sb\") pod \"dd806e2d-2675-448c-96f5-2440c2e243f2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.576529 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-ovsdbserver-nb\") pod \"dd806e2d-2675-448c-96f5-2440c2e243f2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.576760 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xl8t\" (UniqueName: \"kubernetes.io/projected/dd806e2d-2675-448c-96f5-2440c2e243f2-kube-api-access-4xl8t\") pod \"dd806e2d-2675-448c-96f5-2440c2e243f2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.576899 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-dns-svc\") pod \"dd806e2d-2675-448c-96f5-2440c2e243f2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.583585 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/dd806e2d-2675-448c-96f5-2440c2e243f2-kube-api-access-4xl8t" (OuterVolumeSpecName: "kube-api-access-4xl8t") pod "dd806e2d-2675-448c-96f5-2440c2e243f2" (UID: "dd806e2d-2675-448c-96f5-2440c2e243f2"). InnerVolumeSpecName "kube-api-access-4xl8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.659249 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dd806e2d-2675-448c-96f5-2440c2e243f2" (UID: "dd806e2d-2675-448c-96f5-2440c2e243f2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.663187 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dd806e2d-2675-448c-96f5-2440c2e243f2" (UID: "dd806e2d-2675-448c-96f5-2440c2e243f2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.666139 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dd806e2d-2675-448c-96f5-2440c2e243f2" (UID: "dd806e2d-2675-448c-96f5-2440c2e243f2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.668628 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-config" (OuterVolumeSpecName: "config") pod "dd806e2d-2675-448c-96f5-2440c2e243f2" (UID: "dd806e2d-2675-448c-96f5-2440c2e243f2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.680727 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.680776 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.680788 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xl8t\" (UniqueName: \"kubernetes.io/projected/dd806e2d-2675-448c-96f5-2440c2e243f2-kube-api-access-4xl8t\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.680804 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.680819 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:42 crc kubenswrapper[4764]: I0309 13:43:42.491953 4764 generic.go:334] "Generic (PLEG): container finished" podID="2897dc65-e596-414b-b73e-172b0042b6cd" containerID="b7616cfa2d6717183764d5b731f3e4c011f99816e53a80c47d3844cc65c6452d" exitCode=0 Mar 09 13:43:42 crc kubenswrapper[4764]: I0309 13:43:42.492002 4764 generic.go:334] "Generic (PLEG): container finished" podID="2897dc65-e596-414b-b73e-172b0042b6cd" containerID="550d0daaa1bf282375c8ca306a313e035fcd02974a76611ae4861ffff324f8a9" exitCode=2 Mar 09 13:43:42 crc kubenswrapper[4764]: I0309 13:43:42.492009 4764 generic.go:334] "Generic (PLEG): 
container finished" podID="2897dc65-e596-414b-b73e-172b0042b6cd" containerID="b1056172a7b512fb661d17bf402edffaf8b9a666bfde122f2c185a7c23090848" exitCode=0 Mar 09 13:43:42 crc kubenswrapper[4764]: I0309 13:43:42.493135 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2897dc65-e596-414b-b73e-172b0042b6cd","Type":"ContainerDied","Data":"b7616cfa2d6717183764d5b731f3e4c011f99816e53a80c47d3844cc65c6452d"} Mar 09 13:43:42 crc kubenswrapper[4764]: I0309 13:43:42.493298 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2897dc65-e596-414b-b73e-172b0042b6cd","Type":"ContainerDied","Data":"550d0daaa1bf282375c8ca306a313e035fcd02974a76611ae4861ffff324f8a9"} Mar 09 13:43:42 crc kubenswrapper[4764]: I0309 13:43:42.493426 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2897dc65-e596-414b-b73e-172b0042b6cd","Type":"ContainerDied","Data":"b1056172a7b512fb661d17bf402edffaf8b9a666bfde122f2c185a7c23090848"} Mar 09 13:43:42 crc kubenswrapper[4764]: I0309 13:43:42.497280 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" event={"ID":"dd806e2d-2675-448c-96f5-2440c2e243f2","Type":"ContainerDied","Data":"725b7755875ebd6a6866a71e5813a9d3f7def7b5fe8e24fd807f8fb4b2f4627a"} Mar 09 13:43:42 crc kubenswrapper[4764]: I0309 13:43:42.497362 4764 scope.go:117] "RemoveContainer" containerID="d735f180db949f6ebbd416610517dc4e197003ab27c3b1031338a802031ce748" Mar 09 13:43:42 crc kubenswrapper[4764]: I0309 13:43:42.497587 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:43:42 crc kubenswrapper[4764]: I0309 13:43:42.533872 4764 scope.go:117] "RemoveContainer" containerID="a48379ed6aa8003713adc5f3726b706529f649e8177a5d07e91f56612288af49" Mar 09 13:43:42 crc kubenswrapper[4764]: I0309 13:43:42.553303 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-gsgp2"] Mar 09 13:43:42 crc kubenswrapper[4764]: I0309 13:43:42.566337 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-gsgp2"] Mar 09 13:43:43 crc kubenswrapper[4764]: I0309 13:43:43.571795 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd806e2d-2675-448c-96f5-2440c2e243f2" path="/var/lib/kubelet/pods/dd806e2d-2675-448c-96f5-2440c2e243f2/volumes" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.362966 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.439318 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-config-data\") pod \"2897dc65-e596-414b-b73e-172b0042b6cd\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.439800 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-sg-core-conf-yaml\") pod \"2897dc65-e596-414b-b73e-172b0042b6cd\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.439835 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-scripts\") pod \"2897dc65-e596-414b-b73e-172b0042b6cd\" (UID: 
\"2897dc65-e596-414b-b73e-172b0042b6cd\") " Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.439878 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-ceilometer-tls-certs\") pod \"2897dc65-e596-414b-b73e-172b0042b6cd\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.439935 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-combined-ca-bundle\") pod \"2897dc65-e596-414b-b73e-172b0042b6cd\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.439983 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2897dc65-e596-414b-b73e-172b0042b6cd-log-httpd\") pod \"2897dc65-e596-414b-b73e-172b0042b6cd\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.440115 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkrrk\" (UniqueName: \"kubernetes.io/projected/2897dc65-e596-414b-b73e-172b0042b6cd-kube-api-access-dkrrk\") pod \"2897dc65-e596-414b-b73e-172b0042b6cd\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.440149 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2897dc65-e596-414b-b73e-172b0042b6cd-run-httpd\") pod \"2897dc65-e596-414b-b73e-172b0042b6cd\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.441562 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/2897dc65-e596-414b-b73e-172b0042b6cd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2897dc65-e596-414b-b73e-172b0042b6cd" (UID: "2897dc65-e596-414b-b73e-172b0042b6cd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.441920 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2897dc65-e596-414b-b73e-172b0042b6cd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2897dc65-e596-414b-b73e-172b0042b6cd" (UID: "2897dc65-e596-414b-b73e-172b0042b6cd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.447571 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2897dc65-e596-414b-b73e-172b0042b6cd-kube-api-access-dkrrk" (OuterVolumeSpecName: "kube-api-access-dkrrk") pod "2897dc65-e596-414b-b73e-172b0042b6cd" (UID: "2897dc65-e596-414b-b73e-172b0042b6cd"). InnerVolumeSpecName "kube-api-access-dkrrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.462222 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-scripts" (OuterVolumeSpecName: "scripts") pod "2897dc65-e596-414b-b73e-172b0042b6cd" (UID: "2897dc65-e596-414b-b73e-172b0042b6cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.482144 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2897dc65-e596-414b-b73e-172b0042b6cd" (UID: "2897dc65-e596-414b-b73e-172b0042b6cd"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.518601 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "2897dc65-e596-414b-b73e-172b0042b6cd" (UID: "2897dc65-e596-414b-b73e-172b0042b6cd"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.531163 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.530501 4764 generic.go:334] "Generic (PLEG): container finished" podID="2897dc65-e596-414b-b73e-172b0042b6cd" containerID="6aba9d2ac154d93bccd8c03a964b0ce43be5d47c4542ac63eafc38ff3e190148" exitCode=0 Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.531549 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2897dc65-e596-414b-b73e-172b0042b6cd","Type":"ContainerDied","Data":"6aba9d2ac154d93bccd8c03a964b0ce43be5d47c4542ac63eafc38ff3e190148"} Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.531594 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2897dc65-e596-414b-b73e-172b0042b6cd","Type":"ContainerDied","Data":"515481ed93c967d04868f361f4da1be6b976ca58590cd914664e6651afcae34c"} Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.531619 4764 scope.go:117] "RemoveContainer" containerID="b7616cfa2d6717183764d5b731f3e4c011f99816e53a80c47d3844cc65c6452d" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.535851 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"2897dc65-e596-414b-b73e-172b0042b6cd" (UID: "2897dc65-e596-414b-b73e-172b0042b6cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.542889 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.542942 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2897dc65-e596-414b-b73e-172b0042b6cd-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.542956 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkrrk\" (UniqueName: \"kubernetes.io/projected/2897dc65-e596-414b-b73e-172b0042b6cd-kube-api-access-dkrrk\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.542971 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2897dc65-e596-414b-b73e-172b0042b6cd-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.542983 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.542993 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.543003 4764 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.562602 4764 scope.go:117] "RemoveContainer" containerID="550d0daaa1bf282375c8ca306a313e035fcd02974a76611ae4861ffff324f8a9" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.574780 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-config-data" (OuterVolumeSpecName: "config-data") pod "2897dc65-e596-414b-b73e-172b0042b6cd" (UID: "2897dc65-e596-414b-b73e-172b0042b6cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.585308 4764 scope.go:117] "RemoveContainer" containerID="b1056172a7b512fb661d17bf402edffaf8b9a666bfde122f2c185a7c23090848" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.607219 4764 scope.go:117] "RemoveContainer" containerID="6aba9d2ac154d93bccd8c03a964b0ce43be5d47c4542ac63eafc38ff3e190148" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.642806 4764 scope.go:117] "RemoveContainer" containerID="b7616cfa2d6717183764d5b731f3e4c011f99816e53a80c47d3844cc65c6452d" Mar 09 13:43:44 crc kubenswrapper[4764]: E0309 13:43:44.643542 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7616cfa2d6717183764d5b731f3e4c011f99816e53a80c47d3844cc65c6452d\": container with ID starting with b7616cfa2d6717183764d5b731f3e4c011f99816e53a80c47d3844cc65c6452d not found: ID does not exist" containerID="b7616cfa2d6717183764d5b731f3e4c011f99816e53a80c47d3844cc65c6452d" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.643604 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7616cfa2d6717183764d5b731f3e4c011f99816e53a80c47d3844cc65c6452d"} err="failed to get container status 
\"b7616cfa2d6717183764d5b731f3e4c011f99816e53a80c47d3844cc65c6452d\": rpc error: code = NotFound desc = could not find container \"b7616cfa2d6717183764d5b731f3e4c011f99816e53a80c47d3844cc65c6452d\": container with ID starting with b7616cfa2d6717183764d5b731f3e4c011f99816e53a80c47d3844cc65c6452d not found: ID does not exist" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.643664 4764 scope.go:117] "RemoveContainer" containerID="550d0daaa1bf282375c8ca306a313e035fcd02974a76611ae4861ffff324f8a9" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.644386 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:44 crc kubenswrapper[4764]: E0309 13:43:44.644405 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"550d0daaa1bf282375c8ca306a313e035fcd02974a76611ae4861ffff324f8a9\": container with ID starting with 550d0daaa1bf282375c8ca306a313e035fcd02974a76611ae4861ffff324f8a9 not found: ID does not exist" containerID="550d0daaa1bf282375c8ca306a313e035fcd02974a76611ae4861ffff324f8a9" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.644443 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"550d0daaa1bf282375c8ca306a313e035fcd02974a76611ae4861ffff324f8a9"} err="failed to get container status \"550d0daaa1bf282375c8ca306a313e035fcd02974a76611ae4861ffff324f8a9\": rpc error: code = NotFound desc = could not find container \"550d0daaa1bf282375c8ca306a313e035fcd02974a76611ae4861ffff324f8a9\": container with ID starting with 550d0daaa1bf282375c8ca306a313e035fcd02974a76611ae4861ffff324f8a9 not found: ID does not exist" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.644480 4764 scope.go:117] "RemoveContainer" 
containerID="b1056172a7b512fb661d17bf402edffaf8b9a666bfde122f2c185a7c23090848" Mar 09 13:43:44 crc kubenswrapper[4764]: E0309 13:43:44.645345 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1056172a7b512fb661d17bf402edffaf8b9a666bfde122f2c185a7c23090848\": container with ID starting with b1056172a7b512fb661d17bf402edffaf8b9a666bfde122f2c185a7c23090848 not found: ID does not exist" containerID="b1056172a7b512fb661d17bf402edffaf8b9a666bfde122f2c185a7c23090848" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.645387 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1056172a7b512fb661d17bf402edffaf8b9a666bfde122f2c185a7c23090848"} err="failed to get container status \"b1056172a7b512fb661d17bf402edffaf8b9a666bfde122f2c185a7c23090848\": rpc error: code = NotFound desc = could not find container \"b1056172a7b512fb661d17bf402edffaf8b9a666bfde122f2c185a7c23090848\": container with ID starting with b1056172a7b512fb661d17bf402edffaf8b9a666bfde122f2c185a7c23090848 not found: ID does not exist" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.645412 4764 scope.go:117] "RemoveContainer" containerID="6aba9d2ac154d93bccd8c03a964b0ce43be5d47c4542ac63eafc38ff3e190148" Mar 09 13:43:44 crc kubenswrapper[4764]: E0309 13:43:44.645873 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6aba9d2ac154d93bccd8c03a964b0ce43be5d47c4542ac63eafc38ff3e190148\": container with ID starting with 6aba9d2ac154d93bccd8c03a964b0ce43be5d47c4542ac63eafc38ff3e190148 not found: ID does not exist" containerID="6aba9d2ac154d93bccd8c03a964b0ce43be5d47c4542ac63eafc38ff3e190148" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.645905 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6aba9d2ac154d93bccd8c03a964b0ce43be5d47c4542ac63eafc38ff3e190148"} err="failed to get container status \"6aba9d2ac154d93bccd8c03a964b0ce43be5d47c4542ac63eafc38ff3e190148\": rpc error: code = NotFound desc = could not find container \"6aba9d2ac154d93bccd8c03a964b0ce43be5d47c4542ac63eafc38ff3e190148\": container with ID starting with 6aba9d2ac154d93bccd8c03a964b0ce43be5d47c4542ac63eafc38ff3e190148 not found: ID does not exist" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.910700 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.926154 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.939033 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:43:44 crc kubenswrapper[4764]: E0309 13:43:44.940051 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" containerName="sg-core" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.940083 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" containerName="sg-core" Mar 09 13:43:44 crc kubenswrapper[4764]: E0309 13:43:44.940117 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" containerName="ceilometer-central-agent" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.940134 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" containerName="ceilometer-central-agent" Mar 09 13:43:44 crc kubenswrapper[4764]: E0309 13:43:44.940166 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd806e2d-2675-448c-96f5-2440c2e243f2" containerName="init" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.940182 4764 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="dd806e2d-2675-448c-96f5-2440c2e243f2" containerName="init" Mar 09 13:43:44 crc kubenswrapper[4764]: E0309 13:43:44.940211 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" containerName="ceilometer-notification-agent" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.940229 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" containerName="ceilometer-notification-agent" Mar 09 13:43:44 crc kubenswrapper[4764]: E0309 13:43:44.940269 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd806e2d-2675-448c-96f5-2440c2e243f2" containerName="dnsmasq-dns" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.940288 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd806e2d-2675-448c-96f5-2440c2e243f2" containerName="dnsmasq-dns" Mar 09 13:43:44 crc kubenswrapper[4764]: E0309 13:43:44.940314 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" containerName="proxy-httpd" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.940329 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" containerName="proxy-httpd" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.940838 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" containerName="sg-core" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.940880 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" containerName="ceilometer-central-agent" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.940904 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd806e2d-2675-448c-96f5-2440c2e243f2" containerName="dnsmasq-dns" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.940932 4764 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" containerName="ceilometer-notification-agent" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.940956 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" containerName="proxy-httpd" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.945433 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.951513 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.954949 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.955004 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.955156 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.055236 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-config-data\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.055362 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-log-httpd\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.055405 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.055441 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.055491 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-scripts\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.055526 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-run-httpd\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.055574 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxjf8\" (UniqueName: \"kubernetes.io/projected/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-kube-api-access-kxjf8\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.055598 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.157854 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxjf8\" (UniqueName: \"kubernetes.io/projected/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-kube-api-access-kxjf8\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.157922 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.158012 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-config-data\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.158073 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-log-httpd\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.158103 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc 
kubenswrapper[4764]: I0309 13:43:45.158138 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.158182 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-scripts\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.158224 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-run-httpd\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.158856 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-run-httpd\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.158910 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-log-httpd\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.163151 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.163207 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-scripts\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.164594 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-config-data\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.165402 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.173568 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.176636 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxjf8\" (UniqueName: \"kubernetes.io/projected/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-kube-api-access-kxjf8\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.274334 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.580573 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" path="/var/lib/kubelet/pods/2897dc65-e596-414b-b73e-172b0042b6cd/volumes" Mar 09 13:43:45 crc kubenswrapper[4764]: E0309 13:43:45.710289 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd98526d5_8eaa_44a7_a25d_662a4fc8758b.slice/crio-conmon-858a961159a6d1cec6dd22f3429517714ebd1b6a16fd98b28235cb60fc1a94ee.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd98526d5_8eaa_44a7_a25d_662a4fc8758b.slice/crio-858a961159a6d1cec6dd22f3429517714ebd1b6a16fd98b28235cb60fc1a94ee.scope\": RecentStats: unable to find data in memory cache]" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.753451 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:43:46 crc kubenswrapper[4764]: I0309 13:43:46.559132 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7220ea42-daf2-4c41-85c9-0d2bda6d24eb","Type":"ContainerStarted","Data":"ff9bf0b1ebb34e175d58582b99ddb3f1097e979301d7cd95e80e6e75808835d6"} Mar 09 13:43:46 crc kubenswrapper[4764]: I0309 13:43:46.559572 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7220ea42-daf2-4c41-85c9-0d2bda6d24eb","Type":"ContainerStarted","Data":"83415e6b4960e541c9fc0ec3cd4865cce73b704e20a342572bf182a8978c8bc9"} Mar 09 13:43:46 crc kubenswrapper[4764]: I0309 13:43:46.561975 4764 generic.go:334] "Generic (PLEG): container finished" podID="d98526d5-8eaa-44a7-a25d-662a4fc8758b" containerID="858a961159a6d1cec6dd22f3429517714ebd1b6a16fd98b28235cb60fc1a94ee" exitCode=0 Mar 09 13:43:46 crc 
kubenswrapper[4764]: I0309 13:43:46.562009 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-q8h4m" event={"ID":"d98526d5-8eaa-44a7-a25d-662a4fc8758b","Type":"ContainerDied","Data":"858a961159a6d1cec6dd22f3429517714ebd1b6a16fd98b28235cb60fc1a94ee"} Mar 09 13:43:47 crc kubenswrapper[4764]: I0309 13:43:47.592968 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7220ea42-daf2-4c41-85c9-0d2bda6d24eb","Type":"ContainerStarted","Data":"cb0403e3d98f6b11f541bb3ed05951b3aa36eed0ed52f4863be171ce5eaf63e2"} Mar 09 13:43:47 crc kubenswrapper[4764]: I0309 13:43:47.972542 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-q8h4m" Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.044549 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.044608 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.136161 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-scripts\") pod \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\" (UID: \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\") " Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.136255 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-combined-ca-bundle\") pod \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\" (UID: \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\") " Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.136375 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrhwm\" (UniqueName: 
\"kubernetes.io/projected/d98526d5-8eaa-44a7-a25d-662a4fc8758b-kube-api-access-vrhwm\") pod \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\" (UID: \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\") " Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.136681 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-config-data\") pod \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\" (UID: \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\") " Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.144428 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-scripts" (OuterVolumeSpecName: "scripts") pod "d98526d5-8eaa-44a7-a25d-662a4fc8758b" (UID: "d98526d5-8eaa-44a7-a25d-662a4fc8758b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.144739 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d98526d5-8eaa-44a7-a25d-662a4fc8758b-kube-api-access-vrhwm" (OuterVolumeSpecName: "kube-api-access-vrhwm") pod "d98526d5-8eaa-44a7-a25d-662a4fc8758b" (UID: "d98526d5-8eaa-44a7-a25d-662a4fc8758b"). InnerVolumeSpecName "kube-api-access-vrhwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.167517 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-config-data" (OuterVolumeSpecName: "config-data") pod "d98526d5-8eaa-44a7-a25d-662a4fc8758b" (UID: "d98526d5-8eaa-44a7-a25d-662a4fc8758b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.174776 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d98526d5-8eaa-44a7-a25d-662a4fc8758b" (UID: "d98526d5-8eaa-44a7-a25d-662a4fc8758b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.240355 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.240412 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.240426 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.240445 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrhwm\" (UniqueName: \"kubernetes.io/projected/d98526d5-8eaa-44a7-a25d-662a4fc8758b-kube-api-access-vrhwm\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.629693 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7220ea42-daf2-4c41-85c9-0d2bda6d24eb","Type":"ContainerStarted","Data":"c44f22885081c03e82b6817f6378545ad23fd40c3fd48004324856781c5d89d8"} Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.634412 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-cell-mapping-q8h4m" event={"ID":"d98526d5-8eaa-44a7-a25d-662a4fc8758b","Type":"ContainerDied","Data":"0ad74f9523811644a338c930f4e1d9354e736d92d5564d6b6688e12a4bedac66"} Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.634463 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ad74f9523811644a338c930f4e1d9354e736d92d5564d6b6688e12a4bedac66" Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.634532 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-q8h4m" Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.775471 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.776192 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="cf561cd8-b441-4efe-8f37-9c925d1f7aa9" containerName="nova-api-log" containerID="cri-o://8364f048c74ee582bf3a1b604c7622e3c5a53940323ec1e7e02d222016811233" gracePeriod=30 Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.776229 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="cf561cd8-b441-4efe-8f37-9c925d1f7aa9" containerName="nova-api-api" containerID="cri-o://8800be21f6e5e98b3173a95394f9eca30f4eb9aa26729be4bf34f42eed9c86fc" gracePeriod=30 Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.786270 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="cf561cd8-b441-4efe-8f37-9c925d1f7aa9" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.191:8774/\": EOF" Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.790878 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="cf561cd8-b441-4efe-8f37-9c925d1f7aa9" containerName="nova-api-log" probeResult="failure" output="Get 
\"https://10.217.0.191:8774/\": EOF" Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.798682 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.799077 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e3d70a54-660f-4ef9-bd2a-ed16699d8d66" containerName="nova-scheduler-scheduler" containerID="cri-o://dc09c0ba380d56f5c82ec457ccdcea9f947bd8b08307a77c8684608027a7c1ac" gracePeriod=30 Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.849921 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.850592 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8aba6bca-21f2-4e18-90e7-098c8541a4f4" containerName="nova-metadata-log" containerID="cri-o://ce8b8b59c7d046943c3a2a1f0ccc0e08b931a62a55c138ad63284ef6b7ad9a9a" gracePeriod=30 Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.850875 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8aba6bca-21f2-4e18-90e7-098c8541a4f4" containerName="nova-metadata-metadata" containerID="cri-o://43bb793d9ae7aef4ac5c3135ad4636253715b28d8b5d66da3fd63d1b86a1f016" gracePeriod=30 Mar 09 13:43:49 crc kubenswrapper[4764]: I0309 13:43:49.655852 4764 generic.go:334] "Generic (PLEG): container finished" podID="8aba6bca-21f2-4e18-90e7-098c8541a4f4" containerID="ce8b8b59c7d046943c3a2a1f0ccc0e08b931a62a55c138ad63284ef6b7ad9a9a" exitCode=143 Mar 09 13:43:49 crc kubenswrapper[4764]: I0309 13:43:49.656467 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8aba6bca-21f2-4e18-90e7-098c8541a4f4","Type":"ContainerDied","Data":"ce8b8b59c7d046943c3a2a1f0ccc0e08b931a62a55c138ad63284ef6b7ad9a9a"} Mar 09 13:43:49 crc 
kubenswrapper[4764]: I0309 13:43:49.672400 4764 generic.go:334] "Generic (PLEG): container finished" podID="cf561cd8-b441-4efe-8f37-9c925d1f7aa9" containerID="8364f048c74ee582bf3a1b604c7622e3c5a53940323ec1e7e02d222016811233" exitCode=143 Mar 09 13:43:49 crc kubenswrapper[4764]: I0309 13:43:49.672467 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf561cd8-b441-4efe-8f37-9c925d1f7aa9","Type":"ContainerDied","Data":"8364f048c74ee582bf3a1b604c7622e3c5a53940323ec1e7e02d222016811233"} Mar 09 13:43:49 crc kubenswrapper[4764]: I0309 13:43:49.692223 4764 generic.go:334] "Generic (PLEG): container finished" podID="e3d70a54-660f-4ef9-bd2a-ed16699d8d66" containerID="dc09c0ba380d56f5c82ec457ccdcea9f947bd8b08307a77c8684608027a7c1ac" exitCode=0 Mar 09 13:43:49 crc kubenswrapper[4764]: I0309 13:43:49.692269 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e3d70a54-660f-4ef9-bd2a-ed16699d8d66","Type":"ContainerDied","Data":"dc09c0ba380d56f5c82ec457ccdcea9f947bd8b08307a77c8684608027a7c1ac"} Mar 09 13:43:49 crc kubenswrapper[4764]: I0309 13:43:49.871391 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 13:43:49 crc kubenswrapper[4764]: I0309 13:43:49.992498 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtx9b\" (UniqueName: \"kubernetes.io/projected/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-kube-api-access-qtx9b\") pod \"e3d70a54-660f-4ef9-bd2a-ed16699d8d66\" (UID: \"e3d70a54-660f-4ef9-bd2a-ed16699d8d66\") " Mar 09 13:43:49 crc kubenswrapper[4764]: I0309 13:43:49.993172 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-config-data\") pod \"e3d70a54-660f-4ef9-bd2a-ed16699d8d66\" (UID: \"e3d70a54-660f-4ef9-bd2a-ed16699d8d66\") " Mar 09 13:43:49 crc kubenswrapper[4764]: I0309 13:43:49.993461 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-combined-ca-bundle\") pod \"e3d70a54-660f-4ef9-bd2a-ed16699d8d66\" (UID: \"e3d70a54-660f-4ef9-bd2a-ed16699d8d66\") " Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.003795 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-kube-api-access-qtx9b" (OuterVolumeSpecName: "kube-api-access-qtx9b") pod "e3d70a54-660f-4ef9-bd2a-ed16699d8d66" (UID: "e3d70a54-660f-4ef9-bd2a-ed16699d8d66"). InnerVolumeSpecName "kube-api-access-qtx9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.033823 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3d70a54-660f-4ef9-bd2a-ed16699d8d66" (UID: "e3d70a54-660f-4ef9-bd2a-ed16699d8d66"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.036352 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-config-data" (OuterVolumeSpecName: "config-data") pod "e3d70a54-660f-4ef9-bd2a-ed16699d8d66" (UID: "e3d70a54-660f-4ef9-bd2a-ed16699d8d66"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.096508 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.096546 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.096560 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtx9b\" (UniqueName: \"kubernetes.io/projected/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-kube-api-access-qtx9b\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.711804 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7220ea42-daf2-4c41-85c9-0d2bda6d24eb","Type":"ContainerStarted","Data":"5ee7ad3de1f9928c6b59a6676d88b6e84ba3c1fa18e74865387fac5e3b0fa60c"} Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.713503 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.718846 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"e3d70a54-660f-4ef9-bd2a-ed16699d8d66","Type":"ContainerDied","Data":"737be67408dde868ad9928c7e4b5b5a92634607014e02a50994bdc6c48b356c6"} Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.718899 4764 scope.go:117] "RemoveContainer" containerID="dc09c0ba380d56f5c82ec457ccdcea9f947bd8b08307a77c8684608027a7c1ac" Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.719038 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.812080 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.016131235 podStartE2EDuration="6.811946719s" podCreationTimestamp="2026-03-09 13:43:44 +0000 UTC" firstStartedPulling="2026-03-09 13:43:45.761364894 +0000 UTC m=+1381.011536802" lastFinishedPulling="2026-03-09 13:43:49.557180378 +0000 UTC m=+1384.807352286" observedRunningTime="2026-03-09 13:43:50.755615974 +0000 UTC m=+1386.005787882" watchObservedRunningTime="2026-03-09 13:43:50.811946719 +0000 UTC m=+1386.062118637" Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.827706 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.847900 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.861530 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:43:50 crc kubenswrapper[4764]: E0309 13:43:50.862277 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d98526d5-8eaa-44a7-a25d-662a4fc8758b" containerName="nova-manage" Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.862302 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d98526d5-8eaa-44a7-a25d-662a4fc8758b" containerName="nova-manage" Mar 09 13:43:50 crc kubenswrapper[4764]: 
E0309 13:43:50.862355 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3d70a54-660f-4ef9-bd2a-ed16699d8d66" containerName="nova-scheduler-scheduler" Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.862366 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3d70a54-660f-4ef9-bd2a-ed16699d8d66" containerName="nova-scheduler-scheduler" Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.862621 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3d70a54-660f-4ef9-bd2a-ed16699d8d66" containerName="nova-scheduler-scheduler" Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.862672 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d98526d5-8eaa-44a7-a25d-662a4fc8758b" containerName="nova-manage" Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.863804 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.868469 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.887704 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.930412 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d26ba33-e370-4bc8-bb15-b727c0c9c97f-config-data\") pod \"nova-scheduler-0\" (UID: \"7d26ba33-e370-4bc8-bb15-b727c0c9c97f\") " pod="openstack/nova-scheduler-0" Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.930819 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp9k4\" (UniqueName: \"kubernetes.io/projected/7d26ba33-e370-4bc8-bb15-b727c0c9c97f-kube-api-access-jp9k4\") pod \"nova-scheduler-0\" (UID: 
\"7d26ba33-e370-4bc8-bb15-b727c0c9c97f\") " pod="openstack/nova-scheduler-0" Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.931037 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d26ba33-e370-4bc8-bb15-b727c0c9c97f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7d26ba33-e370-4bc8-bb15-b727c0c9c97f\") " pod="openstack/nova-scheduler-0" Mar 09 13:43:51 crc kubenswrapper[4764]: I0309 13:43:51.033508 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp9k4\" (UniqueName: \"kubernetes.io/projected/7d26ba33-e370-4bc8-bb15-b727c0c9c97f-kube-api-access-jp9k4\") pod \"nova-scheduler-0\" (UID: \"7d26ba33-e370-4bc8-bb15-b727c0c9c97f\") " pod="openstack/nova-scheduler-0" Mar 09 13:43:51 crc kubenswrapper[4764]: I0309 13:43:51.033625 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d26ba33-e370-4bc8-bb15-b727c0c9c97f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7d26ba33-e370-4bc8-bb15-b727c0c9c97f\") " pod="openstack/nova-scheduler-0" Mar 09 13:43:51 crc kubenswrapper[4764]: I0309 13:43:51.033755 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d26ba33-e370-4bc8-bb15-b727c0c9c97f-config-data\") pod \"nova-scheduler-0\" (UID: \"7d26ba33-e370-4bc8-bb15-b727c0c9c97f\") " pod="openstack/nova-scheduler-0" Mar 09 13:43:51 crc kubenswrapper[4764]: I0309 13:43:51.041893 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d26ba33-e370-4bc8-bb15-b727c0c9c97f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7d26ba33-e370-4bc8-bb15-b727c0c9c97f\") " pod="openstack/nova-scheduler-0" Mar 09 13:43:51 crc kubenswrapper[4764]: I0309 13:43:51.044036 
4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d26ba33-e370-4bc8-bb15-b727c0c9c97f-config-data\") pod \"nova-scheduler-0\" (UID: \"7d26ba33-e370-4bc8-bb15-b727c0c9c97f\") " pod="openstack/nova-scheduler-0" Mar 09 13:43:51 crc kubenswrapper[4764]: I0309 13:43:51.054929 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp9k4\" (UniqueName: \"kubernetes.io/projected/7d26ba33-e370-4bc8-bb15-b727c0c9c97f-kube-api-access-jp9k4\") pod \"nova-scheduler-0\" (UID: \"7d26ba33-e370-4bc8-bb15-b727c0c9c97f\") " pod="openstack/nova-scheduler-0" Mar 09 13:43:51 crc kubenswrapper[4764]: I0309 13:43:51.191024 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 13:43:51 crc kubenswrapper[4764]: I0309 13:43:51.587284 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3d70a54-660f-4ef9-bd2a-ed16699d8d66" path="/var/lib/kubelet/pods/e3d70a54-660f-4ef9-bd2a-ed16699d8d66/volumes" Mar 09 13:43:51 crc kubenswrapper[4764]: I0309 13:43:51.767630 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:43:51 crc kubenswrapper[4764]: W0309 13:43:51.772276 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d26ba33_e370_4bc8_bb15_b727c0c9c97f.slice/crio-e6917764081c089c5262d61abdf003690e988a0af3f0e46d443245865ac76c2d WatchSource:0}: Error finding container e6917764081c089c5262d61abdf003690e988a0af3f0e46d443245865ac76c2d: Status 404 returned error can't find the container with id e6917764081c089c5262d61abdf003690e988a0af3f0e46d443245865ac76c2d Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.021265 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="8aba6bca-21f2-4e18-90e7-098c8541a4f4" 
containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.185:8775/\": read tcp 10.217.0.2:51062->10.217.0.185:8775: read: connection reset by peer" Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.021442 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="8aba6bca-21f2-4e18-90e7-098c8541a4f4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.185:8775/\": read tcp 10.217.0.2:51066->10.217.0.185:8775: read: connection reset by peer" Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.509667 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.674510 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-combined-ca-bundle\") pod \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.674782 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-config-data\") pod \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.674818 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aba6bca-21f2-4e18-90e7-098c8541a4f4-logs\") pod \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.674894 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6cx7\" (UniqueName: 
\"kubernetes.io/projected/8aba6bca-21f2-4e18-90e7-098c8541a4f4-kube-api-access-h6cx7\") pod \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.674967 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-nova-metadata-tls-certs\") pod \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.677421 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8aba6bca-21f2-4e18-90e7-098c8541a4f4-logs" (OuterVolumeSpecName: "logs") pod "8aba6bca-21f2-4e18-90e7-098c8541a4f4" (UID: "8aba6bca-21f2-4e18-90e7-098c8541a4f4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.683165 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8aba6bca-21f2-4e18-90e7-098c8541a4f4-kube-api-access-h6cx7" (OuterVolumeSpecName: "kube-api-access-h6cx7") pod "8aba6bca-21f2-4e18-90e7-098c8541a4f4" (UID: "8aba6bca-21f2-4e18-90e7-098c8541a4f4"). InnerVolumeSpecName "kube-api-access-h6cx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.711422 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8aba6bca-21f2-4e18-90e7-098c8541a4f4" (UID: "8aba6bca-21f2-4e18-90e7-098c8541a4f4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.711824 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-config-data" (OuterVolumeSpecName: "config-data") pod "8aba6bca-21f2-4e18-90e7-098c8541a4f4" (UID: "8aba6bca-21f2-4e18-90e7-098c8541a4f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.737689 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "8aba6bca-21f2-4e18-90e7-098c8541a4f4" (UID: "8aba6bca-21f2-4e18-90e7-098c8541a4f4"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.756983 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7d26ba33-e370-4bc8-bb15-b727c0c9c97f","Type":"ContainerStarted","Data":"d1a93c5bd15cd6153ac10b5f15ea2af29110e60ec1c31734a03ea8c5b7b054ec"} Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.757091 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7d26ba33-e370-4bc8-bb15-b727c0c9c97f","Type":"ContainerStarted","Data":"e6917764081c089c5262d61abdf003690e988a0af3f0e46d443245865ac76c2d"} Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.766329 4764 generic.go:334] "Generic (PLEG): container finished" podID="8aba6bca-21f2-4e18-90e7-098c8541a4f4" containerID="43bb793d9ae7aef4ac5c3135ad4636253715b28d8b5d66da3fd63d1b86a1f016" exitCode=0 Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.767141 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"8aba6bca-21f2-4e18-90e7-098c8541a4f4","Type":"ContainerDied","Data":"43bb793d9ae7aef4ac5c3135ad4636253715b28d8b5d66da3fd63d1b86a1f016"} Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.767215 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8aba6bca-21f2-4e18-90e7-098c8541a4f4","Type":"ContainerDied","Data":"b10db7fc62dc747c8a3073bba39f8052766584cab6d39aec677be986aaca3d56"} Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.767236 4764 scope.go:117] "RemoveContainer" containerID="43bb793d9ae7aef4ac5c3135ad4636253715b28d8b5d66da3fd63d1b86a1f016" Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.767173 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.788610 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.788661 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.788672 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aba6bca-21f2-4e18-90e7-098c8541a4f4-logs\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.788682 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6cx7\" (UniqueName: \"kubernetes.io/projected/8aba6bca-21f2-4e18-90e7-098c8541a4f4-kube-api-access-h6cx7\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.788693 4764 reconciler_common.go:293] "Volume detached for volume 
\"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.790706 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.7906797279999997 podStartE2EDuration="2.790679728s" podCreationTimestamp="2026-03-09 13:43:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:43:52.780405604 +0000 UTC m=+1388.030577522" watchObservedRunningTime="2026-03-09 13:43:52.790679728 +0000 UTC m=+1388.040851636" Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.854515 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.865899 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.876373 4764 scope.go:117] "RemoveContainer" containerID="ce8b8b59c7d046943c3a2a1f0ccc0e08b931a62a55c138ad63284ef6b7ad9a9a" Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.892792 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:43:52 crc kubenswrapper[4764]: E0309 13:43:52.893366 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aba6bca-21f2-4e18-90e7-098c8541a4f4" containerName="nova-metadata-metadata" Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.893391 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aba6bca-21f2-4e18-90e7-098c8541a4f4" containerName="nova-metadata-metadata" Mar 09 13:43:52 crc kubenswrapper[4764]: E0309 13:43:52.893415 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aba6bca-21f2-4e18-90e7-098c8541a4f4" containerName="nova-metadata-log" Mar 09 13:43:52 crc 
kubenswrapper[4764]: I0309 13:43:52.893424 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aba6bca-21f2-4e18-90e7-098c8541a4f4" containerName="nova-metadata-log" Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.893622 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8aba6bca-21f2-4e18-90e7-098c8541a4f4" containerName="nova-metadata-log" Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.893665 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8aba6bca-21f2-4e18-90e7-098c8541a4f4" containerName="nova-metadata-metadata" Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.894845 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.903121 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.903314 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.906045 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.926856 4764 scope.go:117] "RemoveContainer" containerID="43bb793d9ae7aef4ac5c3135ad4636253715b28d8b5d66da3fd63d1b86a1f016" Mar 09 13:43:52 crc kubenswrapper[4764]: E0309 13:43:52.929325 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43bb793d9ae7aef4ac5c3135ad4636253715b28d8b5d66da3fd63d1b86a1f016\": container with ID starting with 43bb793d9ae7aef4ac5c3135ad4636253715b28d8b5d66da3fd63d1b86a1f016 not found: ID does not exist" containerID="43bb793d9ae7aef4ac5c3135ad4636253715b28d8b5d66da3fd63d1b86a1f016" Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.934814 4764 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43bb793d9ae7aef4ac5c3135ad4636253715b28d8b5d66da3fd63d1b86a1f016"} err="failed to get container status \"43bb793d9ae7aef4ac5c3135ad4636253715b28d8b5d66da3fd63d1b86a1f016\": rpc error: code = NotFound desc = could not find container \"43bb793d9ae7aef4ac5c3135ad4636253715b28d8b5d66da3fd63d1b86a1f016\": container with ID starting with 43bb793d9ae7aef4ac5c3135ad4636253715b28d8b5d66da3fd63d1b86a1f016 not found: ID does not exist" Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.934984 4764 scope.go:117] "RemoveContainer" containerID="ce8b8b59c7d046943c3a2a1f0ccc0e08b931a62a55c138ad63284ef6b7ad9a9a" Mar 09 13:43:52 crc kubenswrapper[4764]: E0309 13:43:52.941282 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce8b8b59c7d046943c3a2a1f0ccc0e08b931a62a55c138ad63284ef6b7ad9a9a\": container with ID starting with ce8b8b59c7d046943c3a2a1f0ccc0e08b931a62a55c138ad63284ef6b7ad9a9a not found: ID does not exist" containerID="ce8b8b59c7d046943c3a2a1f0ccc0e08b931a62a55c138ad63284ef6b7ad9a9a" Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.941335 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce8b8b59c7d046943c3a2a1f0ccc0e08b931a62a55c138ad63284ef6b7ad9a9a"} err="failed to get container status \"ce8b8b59c7d046943c3a2a1f0ccc0e08b931a62a55c138ad63284ef6b7ad9a9a\": rpc error: code = NotFound desc = could not find container \"ce8b8b59c7d046943c3a2a1f0ccc0e08b931a62a55c138ad63284ef6b7ad9a9a\": container with ID starting with ce8b8b59c7d046943c3a2a1f0ccc0e08b931a62a55c138ad63284ef6b7ad9a9a not found: ID does not exist" Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.993677 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d9226790-b0dc-460b-8c06-127effde8c19-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d9226790-b0dc-460b-8c06-127effde8c19\") " pod="openstack/nova-metadata-0" Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.993738 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9226790-b0dc-460b-8c06-127effde8c19-logs\") pod \"nova-metadata-0\" (UID: \"d9226790-b0dc-460b-8c06-127effde8c19\") " pod="openstack/nova-metadata-0" Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.993816 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9226790-b0dc-460b-8c06-127effde8c19-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d9226790-b0dc-460b-8c06-127effde8c19\") " pod="openstack/nova-metadata-0" Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.993868 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9226790-b0dc-460b-8c06-127effde8c19-config-data\") pod \"nova-metadata-0\" (UID: \"d9226790-b0dc-460b-8c06-127effde8c19\") " pod="openstack/nova-metadata-0" Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.993956 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9pmc\" (UniqueName: \"kubernetes.io/projected/d9226790-b0dc-460b-8c06-127effde8c19-kube-api-access-h9pmc\") pod \"nova-metadata-0\" (UID: \"d9226790-b0dc-460b-8c06-127effde8c19\") " pod="openstack/nova-metadata-0" Mar 09 13:43:53 crc kubenswrapper[4764]: I0309 13:43:53.096026 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9226790-b0dc-460b-8c06-127effde8c19-nova-metadata-tls-certs\") pod 
\"nova-metadata-0\" (UID: \"d9226790-b0dc-460b-8c06-127effde8c19\") " pod="openstack/nova-metadata-0" Mar 09 13:43:53 crc kubenswrapper[4764]: I0309 13:43:53.096111 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9226790-b0dc-460b-8c06-127effde8c19-config-data\") pod \"nova-metadata-0\" (UID: \"d9226790-b0dc-460b-8c06-127effde8c19\") " pod="openstack/nova-metadata-0" Mar 09 13:43:53 crc kubenswrapper[4764]: I0309 13:43:53.096150 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9pmc\" (UniqueName: \"kubernetes.io/projected/d9226790-b0dc-460b-8c06-127effde8c19-kube-api-access-h9pmc\") pod \"nova-metadata-0\" (UID: \"d9226790-b0dc-460b-8c06-127effde8c19\") " pod="openstack/nova-metadata-0" Mar 09 13:43:53 crc kubenswrapper[4764]: I0309 13:43:53.097538 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9226790-b0dc-460b-8c06-127effde8c19-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d9226790-b0dc-460b-8c06-127effde8c19\") " pod="openstack/nova-metadata-0" Mar 09 13:43:53 crc kubenswrapper[4764]: I0309 13:43:53.097600 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9226790-b0dc-460b-8c06-127effde8c19-logs\") pod \"nova-metadata-0\" (UID: \"d9226790-b0dc-460b-8c06-127effde8c19\") " pod="openstack/nova-metadata-0" Mar 09 13:43:53 crc kubenswrapper[4764]: I0309 13:43:53.098139 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9226790-b0dc-460b-8c06-127effde8c19-logs\") pod \"nova-metadata-0\" (UID: \"d9226790-b0dc-460b-8c06-127effde8c19\") " pod="openstack/nova-metadata-0" Mar 09 13:43:53 crc kubenswrapper[4764]: I0309 13:43:53.101009 4764 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9226790-b0dc-460b-8c06-127effde8c19-config-data\") pod \"nova-metadata-0\" (UID: \"d9226790-b0dc-460b-8c06-127effde8c19\") " pod="openstack/nova-metadata-0" Mar 09 13:43:53 crc kubenswrapper[4764]: I0309 13:43:53.101165 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9226790-b0dc-460b-8c06-127effde8c19-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d9226790-b0dc-460b-8c06-127effde8c19\") " pod="openstack/nova-metadata-0" Mar 09 13:43:53 crc kubenswrapper[4764]: I0309 13:43:53.102182 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9226790-b0dc-460b-8c06-127effde8c19-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d9226790-b0dc-460b-8c06-127effde8c19\") " pod="openstack/nova-metadata-0" Mar 09 13:43:53 crc kubenswrapper[4764]: I0309 13:43:53.115133 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9pmc\" (UniqueName: \"kubernetes.io/projected/d9226790-b0dc-460b-8c06-127effde8c19-kube-api-access-h9pmc\") pod \"nova-metadata-0\" (UID: \"d9226790-b0dc-460b-8c06-127effde8c19\") " pod="openstack/nova-metadata-0" Mar 09 13:43:53 crc kubenswrapper[4764]: I0309 13:43:53.239269 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 13:43:53 crc kubenswrapper[4764]: I0309 13:43:53.575976 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8aba6bca-21f2-4e18-90e7-098c8541a4f4" path="/var/lib/kubelet/pods/8aba6bca-21f2-4e18-90e7-098c8541a4f4/volumes" Mar 09 13:43:53 crc kubenswrapper[4764]: I0309 13:43:53.752073 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:43:53 crc kubenswrapper[4764]: I0309 13:43:53.790919 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d9226790-b0dc-460b-8c06-127effde8c19","Type":"ContainerStarted","Data":"57b71ff6dbb4dee3477ca54350bb356543e459e128cffe708fcbd88e2d54a9be"} Mar 09 13:43:54 crc kubenswrapper[4764]: I0309 13:43:54.803961 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d9226790-b0dc-460b-8c06-127effde8c19","Type":"ContainerStarted","Data":"fc5d9c90f1b6ba9be2e0c4ec1cfda566e29d87d53d6a66d366c6768440d3a6a6"} Mar 09 13:43:54 crc kubenswrapper[4764]: I0309 13:43:54.804356 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d9226790-b0dc-460b-8c06-127effde8c19","Type":"ContainerStarted","Data":"71a060b0e753606350d29e0d158aaee4cade05ca6590264314aa9f5561d7826a"} Mar 09 13:43:54 crc kubenswrapper[4764]: I0309 13:43:54.826407 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.826388628 podStartE2EDuration="2.826388628s" podCreationTimestamp="2026-03-09 13:43:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:43:54.822251048 +0000 UTC m=+1390.072422956" watchObservedRunningTime="2026-03-09 13:43:54.826388628 +0000 UTC m=+1390.076560536" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.760381 4764 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.772371 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wznv\" (UniqueName: \"kubernetes.io/projected/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-kube-api-access-7wznv\") pod \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.772449 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-logs\") pod \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.772581 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-combined-ca-bundle\") pod \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.772626 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-public-tls-certs\") pod \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.772767 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-internal-tls-certs\") pod \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.772939 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-config-data\") pod \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.773442 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-logs" (OuterVolumeSpecName: "logs") pod "cf561cd8-b441-4efe-8f37-9c925d1f7aa9" (UID: "cf561cd8-b441-4efe-8f37-9c925d1f7aa9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.773604 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-logs\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.780504 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-kube-api-access-7wznv" (OuterVolumeSpecName: "kube-api-access-7wznv") pod "cf561cd8-b441-4efe-8f37-9c925d1f7aa9" (UID: "cf561cd8-b441-4efe-8f37-9c925d1f7aa9"). InnerVolumeSpecName "kube-api-access-7wznv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.813982 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf561cd8-b441-4efe-8f37-9c925d1f7aa9" (UID: "cf561cd8-b441-4efe-8f37-9c925d1f7aa9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.838225 4764 generic.go:334] "Generic (PLEG): container finished" podID="cf561cd8-b441-4efe-8f37-9c925d1f7aa9" containerID="8800be21f6e5e98b3173a95394f9eca30f4eb9aa26729be4bf34f42eed9c86fc" exitCode=0 Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.838321 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.838387 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf561cd8-b441-4efe-8f37-9c925d1f7aa9","Type":"ContainerDied","Data":"8800be21f6e5e98b3173a95394f9eca30f4eb9aa26729be4bf34f42eed9c86fc"} Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.838429 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf561cd8-b441-4efe-8f37-9c925d1f7aa9","Type":"ContainerDied","Data":"fa48d3e5b6386cde541f520160503cb71e6ebccb4522e2395f1e89b01ebb551e"} Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.838458 4764 scope.go:117] "RemoveContainer" containerID="8800be21f6e5e98b3173a95394f9eca30f4eb9aa26729be4bf34f42eed9c86fc" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.859427 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-config-data" (OuterVolumeSpecName: "config-data") pod "cf561cd8-b441-4efe-8f37-9c925d1f7aa9" (UID: "cf561cd8-b441-4efe-8f37-9c925d1f7aa9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.864605 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cf561cd8-b441-4efe-8f37-9c925d1f7aa9" (UID: "cf561cd8-b441-4efe-8f37-9c925d1f7aa9"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.883901 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cf561cd8-b441-4efe-8f37-9c925d1f7aa9" (UID: "cf561cd8-b441-4efe-8f37-9c925d1f7aa9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.884234 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-public-tls-certs\") pod \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " Mar 09 13:43:55 crc kubenswrapper[4764]: W0309 13:43:55.884798 4764 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/cf561cd8-b441-4efe-8f37-9c925d1f7aa9/volumes/kubernetes.io~secret/public-tls-certs Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.884875 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cf561cd8-b441-4efe-8f37-9c925d1f7aa9" (UID: "cf561cd8-b441-4efe-8f37-9c925d1f7aa9"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.885147 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.885173 4764 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.885184 4764 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.885200 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.885212 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wznv\" (UniqueName: \"kubernetes.io/projected/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-kube-api-access-7wznv\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.924149 4764 scope.go:117] "RemoveContainer" containerID="8364f048c74ee582bf3a1b604c7622e3c5a53940323ec1e7e02d222016811233" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.948209 4764 scope.go:117] "RemoveContainer" containerID="8800be21f6e5e98b3173a95394f9eca30f4eb9aa26729be4bf34f42eed9c86fc" Mar 09 13:43:55 crc kubenswrapper[4764]: E0309 13:43:55.948877 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8800be21f6e5e98b3173a95394f9eca30f4eb9aa26729be4bf34f42eed9c86fc\": container with ID starting with 8800be21f6e5e98b3173a95394f9eca30f4eb9aa26729be4bf34f42eed9c86fc not found: ID does not exist" containerID="8800be21f6e5e98b3173a95394f9eca30f4eb9aa26729be4bf34f42eed9c86fc" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.948995 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8800be21f6e5e98b3173a95394f9eca30f4eb9aa26729be4bf34f42eed9c86fc"} err="failed to get container status \"8800be21f6e5e98b3173a95394f9eca30f4eb9aa26729be4bf34f42eed9c86fc\": rpc error: code = NotFound desc = could not find container \"8800be21f6e5e98b3173a95394f9eca30f4eb9aa26729be4bf34f42eed9c86fc\": container with ID starting with 8800be21f6e5e98b3173a95394f9eca30f4eb9aa26729be4bf34f42eed9c86fc not found: ID does not exist" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.949089 4764 scope.go:117] "RemoveContainer" containerID="8364f048c74ee582bf3a1b604c7622e3c5a53940323ec1e7e02d222016811233" Mar 09 13:43:55 crc kubenswrapper[4764]: E0309 13:43:55.949602 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8364f048c74ee582bf3a1b604c7622e3c5a53940323ec1e7e02d222016811233\": container with ID starting with 8364f048c74ee582bf3a1b604c7622e3c5a53940323ec1e7e02d222016811233 not found: ID does not exist" containerID="8364f048c74ee582bf3a1b604c7622e3c5a53940323ec1e7e02d222016811233" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.949774 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8364f048c74ee582bf3a1b604c7622e3c5a53940323ec1e7e02d222016811233"} err="failed to get container status \"8364f048c74ee582bf3a1b604c7622e3c5a53940323ec1e7e02d222016811233\": rpc error: code = NotFound desc = could not find container \"8364f048c74ee582bf3a1b604c7622e3c5a53940323ec1e7e02d222016811233\": container with ID 
starting with 8364f048c74ee582bf3a1b604c7622e3c5a53940323ec1e7e02d222016811233 not found: ID does not exist" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.182724 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.191450 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.201487 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.215025 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 09 13:43:56 crc kubenswrapper[4764]: E0309 13:43:56.215698 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf561cd8-b441-4efe-8f37-9c925d1f7aa9" containerName="nova-api-api" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.215723 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf561cd8-b441-4efe-8f37-9c925d1f7aa9" containerName="nova-api-api" Mar 09 13:43:56 crc kubenswrapper[4764]: E0309 13:43:56.215773 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf561cd8-b441-4efe-8f37-9c925d1f7aa9" containerName="nova-api-log" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.215782 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf561cd8-b441-4efe-8f37-9c925d1f7aa9" containerName="nova-api-log" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.215988 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf561cd8-b441-4efe-8f37-9c925d1f7aa9" containerName="nova-api-api" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.216038 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf561cd8-b441-4efe-8f37-9c925d1f7aa9" containerName="nova-api-log" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.217428 4764 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.235292 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.236542 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.238560 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.276932 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.292935 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec790643-05dd-4f21-82f8-ad1586087d85-logs\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.293327 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec790643-05dd-4f21-82f8-ad1586087d85-public-tls-certs\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.293450 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec790643-05dd-4f21-82f8-ad1586087d85-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.293565 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ec790643-05dd-4f21-82f8-ad1586087d85-config-data\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.293708 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec790643-05dd-4f21-82f8-ad1586087d85-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.293826 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r27m\" (UniqueName: \"kubernetes.io/projected/ec790643-05dd-4f21-82f8-ad1586087d85-kube-api-access-2r27m\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.396135 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec790643-05dd-4f21-82f8-ad1586087d85-logs\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.396704 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec790643-05dd-4f21-82f8-ad1586087d85-public-tls-certs\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.396759 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec790643-05dd-4f21-82f8-ad1586087d85-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc 
kubenswrapper[4764]: I0309 13:43:56.396779 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec790643-05dd-4f21-82f8-ad1586087d85-config-data\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.396828 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec790643-05dd-4f21-82f8-ad1586087d85-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.396858 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r27m\" (UniqueName: \"kubernetes.io/projected/ec790643-05dd-4f21-82f8-ad1586087d85-kube-api-access-2r27m\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.397136 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec790643-05dd-4f21-82f8-ad1586087d85-logs\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.405153 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec790643-05dd-4f21-82f8-ad1586087d85-config-data\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.405951 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec790643-05dd-4f21-82f8-ad1586087d85-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.406104 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec790643-05dd-4f21-82f8-ad1586087d85-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.406445 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec790643-05dd-4f21-82f8-ad1586087d85-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.413284 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r27m\" (UniqueName: \"kubernetes.io/projected/ec790643-05dd-4f21-82f8-ad1586087d85-kube-api-access-2r27m\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.544385 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:43:57 crc kubenswrapper[4764]: I0309 13:43:57.019926 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:43:57 crc kubenswrapper[4764]: W0309 13:43:57.025422 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec790643_05dd_4f21_82f8_ad1586087d85.slice/crio-e5febb85c2e4ff2e87ece06adfc53fc61e86aa12892d3775be37ecfec0c5e2b3 WatchSource:0}: Error finding container e5febb85c2e4ff2e87ece06adfc53fc61e86aa12892d3775be37ecfec0c5e2b3: Status 404 returned error can't find the container with id e5febb85c2e4ff2e87ece06adfc53fc61e86aa12892d3775be37ecfec0c5e2b3 Mar 09 13:43:57 crc kubenswrapper[4764]: I0309 13:43:57.572973 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf561cd8-b441-4efe-8f37-9c925d1f7aa9" path="/var/lib/kubelet/pods/cf561cd8-b441-4efe-8f37-9c925d1f7aa9/volumes" Mar 09 13:43:57 crc kubenswrapper[4764]: I0309 13:43:57.923418 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ec790643-05dd-4f21-82f8-ad1586087d85","Type":"ContainerStarted","Data":"10414c7ac610cd41780af7da76309dda5c434bf22ccc7520ea9f6e1988bd0034"} Mar 09 13:43:57 crc kubenswrapper[4764]: I0309 13:43:57.923993 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ec790643-05dd-4f21-82f8-ad1586087d85","Type":"ContainerStarted","Data":"882c8c96a125a289675d95cbe0f9722a96ed40479aae5418f155662006868bf0"} Mar 09 13:43:57 crc kubenswrapper[4764]: I0309 13:43:57.924017 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ec790643-05dd-4f21-82f8-ad1586087d85","Type":"ContainerStarted","Data":"e5febb85c2e4ff2e87ece06adfc53fc61e86aa12892d3775be37ecfec0c5e2b3"} Mar 09 13:43:57 crc kubenswrapper[4764]: I0309 13:43:57.959281 4764 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.959243123 podStartE2EDuration="1.959243123s" podCreationTimestamp="2026-03-09 13:43:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:43:57.95051862 +0000 UTC m=+1393.200690538" watchObservedRunningTime="2026-03-09 13:43:57.959243123 +0000 UTC m=+1393.209415051" Mar 09 13:43:58 crc kubenswrapper[4764]: I0309 13:43:58.239991 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 09 13:43:58 crc kubenswrapper[4764]: I0309 13:43:58.240123 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 09 13:43:58 crc kubenswrapper[4764]: I0309 13:43:58.370852 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:43:58 crc kubenswrapper[4764]: I0309 13:43:58.370954 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:44:00 crc kubenswrapper[4764]: I0309 13:44:00.145934 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551064-mbrph"] Mar 09 13:44:00 crc kubenswrapper[4764]: I0309 13:44:00.147896 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551064-mbrph" Mar 09 13:44:00 crc kubenswrapper[4764]: I0309 13:44:00.149887 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:44:00 crc kubenswrapper[4764]: I0309 13:44:00.150882 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 13:44:00 crc kubenswrapper[4764]: I0309 13:44:00.151092 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:44:00 crc kubenswrapper[4764]: I0309 13:44:00.160139 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551064-mbrph"] Mar 09 13:44:00 crc kubenswrapper[4764]: I0309 13:44:00.290403 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzmqh\" (UniqueName: \"kubernetes.io/projected/77179ff3-861b-4aab-b1b2-db4d12041264-kube-api-access-kzmqh\") pod \"auto-csr-approver-29551064-mbrph\" (UID: \"77179ff3-861b-4aab-b1b2-db4d12041264\") " pod="openshift-infra/auto-csr-approver-29551064-mbrph" Mar 09 13:44:00 crc kubenswrapper[4764]: I0309 13:44:00.393094 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzmqh\" (UniqueName: \"kubernetes.io/projected/77179ff3-861b-4aab-b1b2-db4d12041264-kube-api-access-kzmqh\") pod \"auto-csr-approver-29551064-mbrph\" (UID: \"77179ff3-861b-4aab-b1b2-db4d12041264\") " pod="openshift-infra/auto-csr-approver-29551064-mbrph" Mar 09 13:44:00 crc kubenswrapper[4764]: I0309 13:44:00.414441 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzmqh\" (UniqueName: \"kubernetes.io/projected/77179ff3-861b-4aab-b1b2-db4d12041264-kube-api-access-kzmqh\") pod \"auto-csr-approver-29551064-mbrph\" (UID: \"77179ff3-861b-4aab-b1b2-db4d12041264\") " 
pod="openshift-infra/auto-csr-approver-29551064-mbrph" Mar 09 13:44:00 crc kubenswrapper[4764]: I0309 13:44:00.475717 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551064-mbrph" Mar 09 13:44:00 crc kubenswrapper[4764]: I0309 13:44:00.960632 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551064-mbrph"] Mar 09 13:44:00 crc kubenswrapper[4764]: W0309 13:44:00.970485 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77179ff3_861b_4aab_b1b2_db4d12041264.slice/crio-b59e2cc9c2be801b32ba93c61672b061be7f18045f91f26eaaff86ba50201fb0 WatchSource:0}: Error finding container b59e2cc9c2be801b32ba93c61672b061be7f18045f91f26eaaff86ba50201fb0: Status 404 returned error can't find the container with id b59e2cc9c2be801b32ba93c61672b061be7f18045f91f26eaaff86ba50201fb0 Mar 09 13:44:01 crc kubenswrapper[4764]: I0309 13:44:01.191374 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 09 13:44:01 crc kubenswrapper[4764]: I0309 13:44:01.226363 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 09 13:44:01 crc kubenswrapper[4764]: I0309 13:44:01.973300 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551064-mbrph" event={"ID":"77179ff3-861b-4aab-b1b2-db4d12041264","Type":"ContainerStarted","Data":"b59e2cc9c2be801b32ba93c61672b061be7f18045f91f26eaaff86ba50201fb0"} Mar 09 13:44:02 crc kubenswrapper[4764]: I0309 13:44:02.001749 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 09 13:44:02 crc kubenswrapper[4764]: I0309 13:44:02.985574 4764 generic.go:334] "Generic (PLEG): container finished" podID="77179ff3-861b-4aab-b1b2-db4d12041264" 
containerID="6b1dbed6bf6e61f7c12ad4f9dbf8d714fbaa8de7054f1b548bd8d7d1b560e4d6" exitCode=0 Mar 09 13:44:02 crc kubenswrapper[4764]: I0309 13:44:02.985694 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551064-mbrph" event={"ID":"77179ff3-861b-4aab-b1b2-db4d12041264","Type":"ContainerDied","Data":"6b1dbed6bf6e61f7c12ad4f9dbf8d714fbaa8de7054f1b548bd8d7d1b560e4d6"} Mar 09 13:44:03 crc kubenswrapper[4764]: I0309 13:44:03.240055 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 09 13:44:03 crc kubenswrapper[4764]: I0309 13:44:03.240488 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 09 13:44:04 crc kubenswrapper[4764]: I0309 13:44:04.252857 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d9226790-b0dc-460b-8c06-127effde8c19" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 13:44:04 crc kubenswrapper[4764]: I0309 13:44:04.252857 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d9226790-b0dc-460b-8c06-127effde8c19" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 13:44:04 crc kubenswrapper[4764]: I0309 13:44:04.363739 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551064-mbrph" Mar 09 13:44:04 crc kubenswrapper[4764]: I0309 13:44:04.497472 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzmqh\" (UniqueName: \"kubernetes.io/projected/77179ff3-861b-4aab-b1b2-db4d12041264-kube-api-access-kzmqh\") pod \"77179ff3-861b-4aab-b1b2-db4d12041264\" (UID: \"77179ff3-861b-4aab-b1b2-db4d12041264\") " Mar 09 13:44:04 crc kubenswrapper[4764]: I0309 13:44:04.511134 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77179ff3-861b-4aab-b1b2-db4d12041264-kube-api-access-kzmqh" (OuterVolumeSpecName: "kube-api-access-kzmqh") pod "77179ff3-861b-4aab-b1b2-db4d12041264" (UID: "77179ff3-861b-4aab-b1b2-db4d12041264"). InnerVolumeSpecName "kube-api-access-kzmqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:04 crc kubenswrapper[4764]: I0309 13:44:04.601253 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzmqh\" (UniqueName: \"kubernetes.io/projected/77179ff3-861b-4aab-b1b2-db4d12041264-kube-api-access-kzmqh\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:05 crc kubenswrapper[4764]: I0309 13:44:05.010772 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551064-mbrph" event={"ID":"77179ff3-861b-4aab-b1b2-db4d12041264","Type":"ContainerDied","Data":"b59e2cc9c2be801b32ba93c61672b061be7f18045f91f26eaaff86ba50201fb0"} Mar 09 13:44:05 crc kubenswrapper[4764]: I0309 13:44:05.010831 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b59e2cc9c2be801b32ba93c61672b061be7f18045f91f26eaaff86ba50201fb0" Mar 09 13:44:05 crc kubenswrapper[4764]: I0309 13:44:05.010915 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551064-mbrph" Mar 09 13:44:05 crc kubenswrapper[4764]: I0309 13:44:05.451465 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551058-6mlbf"] Mar 09 13:44:05 crc kubenswrapper[4764]: I0309 13:44:05.461489 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551058-6mlbf"] Mar 09 13:44:05 crc kubenswrapper[4764]: I0309 13:44:05.582234 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="175910d6-eb27-4000-ac8b-9ea49f05bb8b" path="/var/lib/kubelet/pods/175910d6-eb27-4000-ac8b-9ea49f05bb8b/volumes" Mar 09 13:44:06 crc kubenswrapper[4764]: I0309 13:44:06.545123 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 13:44:06 crc kubenswrapper[4764]: I0309 13:44:06.545203 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 13:44:07 crc kubenswrapper[4764]: I0309 13:44:07.552057 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ec790643-05dd-4f21-82f8-ad1586087d85" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.196:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 13:44:07 crc kubenswrapper[4764]: I0309 13:44:07.560044 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ec790643-05dd-4f21-82f8-ad1586087d85" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.196:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 13:44:13 crc kubenswrapper[4764]: I0309 13:44:13.247550 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 09 13:44:13 crc kubenswrapper[4764]: I0309 13:44:13.249243 4764 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 09 13:44:13 crc kubenswrapper[4764]: I0309 13:44:13.259363 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 09 13:44:14 crc kubenswrapper[4764]: I0309 13:44:14.107878 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 09 13:44:15 crc kubenswrapper[4764]: I0309 13:44:15.287077 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 09 13:44:16 crc kubenswrapper[4764]: I0309 13:44:16.553263 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 09 13:44:16 crc kubenswrapper[4764]: I0309 13:44:16.553700 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 09 13:44:16 crc kubenswrapper[4764]: I0309 13:44:16.554127 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 09 13:44:16 crc kubenswrapper[4764]: I0309 13:44:16.554150 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 09 13:44:16 crc kubenswrapper[4764]: I0309 13:44:16.561267 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 09 13:44:16 crc kubenswrapper[4764]: I0309 13:44:16.561917 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 09 13:44:24 crc kubenswrapper[4764]: I0309 13:44:24.015990 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 13:44:24 crc kubenswrapper[4764]: I0309 13:44:24.939921 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 13:44:28 crc kubenswrapper[4764]: I0309 13:44:28.370332 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:44:28 crc kubenswrapper[4764]: I0309 13:44:28.371331 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:44:28 crc kubenswrapper[4764]: I0309 13:44:28.371422 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:44:28 crc kubenswrapper[4764]: I0309 13:44:28.372694 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"69810f80271be58962525ba5a5a37ce68d5e172f7c253c440a5a45f3f3beab77"} pod="openshift-machine-config-operator/machine-config-daemon-xxczl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 13:44:28 crc kubenswrapper[4764]: I0309 13:44:28.372764 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" containerID="cri-o://69810f80271be58962525ba5a5a37ce68d5e172f7c253c440a5a45f3f3beab77" gracePeriod=600 Mar 09 13:44:29 crc kubenswrapper[4764]: I0309 13:44:29.270955 4764 generic.go:334] "Generic (PLEG): container finished" podID="6bcdd179-43c2-427c-9fac-7155c122e922" containerID="69810f80271be58962525ba5a5a37ce68d5e172f7c253c440a5a45f3f3beab77" exitCode=0 Mar 09 13:44:29 crc kubenswrapper[4764]: I0309 13:44:29.271054 4764 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerDied","Data":"69810f80271be58962525ba5a5a37ce68d5e172f7c253c440a5a45f3f3beab77"} Mar 09 13:44:29 crc kubenswrapper[4764]: I0309 13:44:29.271508 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb"} Mar 09 13:44:29 crc kubenswrapper[4764]: I0309 13:44:29.271545 4764 scope.go:117] "RemoveContainer" containerID="089fede4f6de3963d0297d0b784543ed7a9d38cdefe66f6bbb9aab989dbffbb0" Mar 09 13:44:29 crc kubenswrapper[4764]: I0309 13:44:29.672306 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="507bcef1-e9ef-4eb1-85ce-358209b944bc" containerName="rabbitmq" containerID="cri-o://bed2e6d72dc23d14fbbd197ac01731271bf14e49fb44cf788ecc2cd8bf1b2828" gracePeriod=604795 Mar 09 13:44:29 crc kubenswrapper[4764]: I0309 13:44:29.795792 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="11c8bf9f-a031-4e56-b1d7-49b407eabaf7" containerName="rabbitmq" containerID="cri-o://c051454f6de8558a84ceb371da4edc9fd851c2e829bf18eb804c0c4e657ef0ee" gracePeriod=604796 Mar 09 13:44:34 crc kubenswrapper[4764]: I0309 13:44:34.829402 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="11c8bf9f-a031-4e56-b1d7-49b407eabaf7" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Mar 09 13:44:34 crc kubenswrapper[4764]: I0309 13:44:34.912708 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="507bcef1-e9ef-4eb1-85ce-358209b944bc" 
containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.286078 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.353249 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/507bcef1-e9ef-4eb1-85ce-358209b944bc-pod-info\") pod \"507bcef1-e9ef-4eb1-85ce-358209b944bc\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.353350 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w75z\" (UniqueName: \"kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-kube-api-access-2w75z\") pod \"507bcef1-e9ef-4eb1-85ce-358209b944bc\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.353391 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-tls\") pod \"507bcef1-e9ef-4eb1-85ce-358209b944bc\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.353471 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-config-data\") pod \"507bcef1-e9ef-4eb1-85ce-358209b944bc\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.353509 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-erlang-cookie\") pod 
\"507bcef1-e9ef-4eb1-85ce-358209b944bc\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.353569 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-server-conf\") pod \"507bcef1-e9ef-4eb1-85ce-358209b944bc\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.353627 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-plugins\") pod \"507bcef1-e9ef-4eb1-85ce-358209b944bc\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.353754 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"507bcef1-e9ef-4eb1-85ce-358209b944bc\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.353783 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-plugins-conf\") pod \"507bcef1-e9ef-4eb1-85ce-358209b944bc\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.353871 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-confd\") pod \"507bcef1-e9ef-4eb1-85ce-358209b944bc\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.353921 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" 
(UniqueName: \"kubernetes.io/secret/507bcef1-e9ef-4eb1-85ce-358209b944bc-erlang-cookie-secret\") pod \"507bcef1-e9ef-4eb1-85ce-358209b944bc\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.354419 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "507bcef1-e9ef-4eb1-85ce-358209b944bc" (UID: "507bcef1-e9ef-4eb1-85ce-358209b944bc"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.354891 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.355541 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "507bcef1-e9ef-4eb1-85ce-358209b944bc" (UID: "507bcef1-e9ef-4eb1-85ce-358209b944bc"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.355703 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "507bcef1-e9ef-4eb1-85ce-358209b944bc" (UID: "507bcef1-e9ef-4eb1-85ce-358209b944bc"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.363715 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/507bcef1-e9ef-4eb1-85ce-358209b944bc-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "507bcef1-e9ef-4eb1-85ce-358209b944bc" (UID: "507bcef1-e9ef-4eb1-85ce-358209b944bc"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.366229 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/507bcef1-e9ef-4eb1-85ce-358209b944bc-pod-info" (OuterVolumeSpecName: "pod-info") pod "507bcef1-e9ef-4eb1-85ce-358209b944bc" (UID: "507bcef1-e9ef-4eb1-85ce-358209b944bc"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.367123 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-kube-api-access-2w75z" (OuterVolumeSpecName: "kube-api-access-2w75z") pod "507bcef1-e9ef-4eb1-85ce-358209b944bc" (UID: "507bcef1-e9ef-4eb1-85ce-358209b944bc"). InnerVolumeSpecName "kube-api-access-2w75z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.366049 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "507bcef1-e9ef-4eb1-85ce-358209b944bc" (UID: "507bcef1-e9ef-4eb1-85ce-358209b944bc"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.373425 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "507bcef1-e9ef-4eb1-85ce-358209b944bc" (UID: "507bcef1-e9ef-4eb1-85ce-358209b944bc"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.381256 4764 generic.go:334] "Generic (PLEG): container finished" podID="507bcef1-e9ef-4eb1-85ce-358209b944bc" containerID="bed2e6d72dc23d14fbbd197ac01731271bf14e49fb44cf788ecc2cd8bf1b2828" exitCode=0 Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.383697 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"507bcef1-e9ef-4eb1-85ce-358209b944bc","Type":"ContainerDied","Data":"bed2e6d72dc23d14fbbd197ac01731271bf14e49fb44cf788ecc2cd8bf1b2828"} Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.383992 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"507bcef1-e9ef-4eb1-85ce-358209b944bc","Type":"ContainerDied","Data":"ee9809e2cf751402688e9f6828a75759ba83ac17c29d13b65aa1aa2a2afdc207"} Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.384190 4764 scope.go:117] "RemoveContainer" containerID="bed2e6d72dc23d14fbbd197ac01731271bf14e49fb44cf788ecc2cd8bf1b2828" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.384700 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.390964 4764 generic.go:334] "Generic (PLEG): container finished" podID="11c8bf9f-a031-4e56-b1d7-49b407eabaf7" containerID="c051454f6de8558a84ceb371da4edc9fd851c2e829bf18eb804c0c4e657ef0ee" exitCode=0 Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.391029 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"11c8bf9f-a031-4e56-b1d7-49b407eabaf7","Type":"ContainerDied","Data":"c051454f6de8558a84ceb371da4edc9fd851c2e829bf18eb804c0c4e657ef0ee"} Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.408366 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-config-data" (OuterVolumeSpecName: "config-data") pod "507bcef1-e9ef-4eb1-85ce-358209b944bc" (UID: "507bcef1-e9ef-4eb1-85ce-358209b944bc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.466179 4764 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/507bcef1-e9ef-4eb1-85ce-358209b944bc-pod-info\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.466229 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w75z\" (UniqueName: \"kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-kube-api-access-2w75z\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.466244 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.466257 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.466272 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.466303 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.466311 4764 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.466321 4764 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/507bcef1-e9ef-4eb1-85ce-358209b944bc-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.475261 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.475992 4764 scope.go:117] "RemoveContainer" containerID="f2c108c98ecfd46a140cf2428c8bd7a9e4bb88eab84b2da0f3783d087eb2e601" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.515611 4764 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.522626 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-server-conf" (OuterVolumeSpecName: "server-conf") pod "507bcef1-e9ef-4eb1-85ce-358209b944bc" (UID: "507bcef1-e9ef-4eb1-85ce-358209b944bc"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.537699 4764 scope.go:117] "RemoveContainer" containerID="bed2e6d72dc23d14fbbd197ac01731271bf14e49fb44cf788ecc2cd8bf1b2828" Mar 09 13:44:36 crc kubenswrapper[4764]: E0309 13:44:36.543161 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bed2e6d72dc23d14fbbd197ac01731271bf14e49fb44cf788ecc2cd8bf1b2828\": container with ID starting with bed2e6d72dc23d14fbbd197ac01731271bf14e49fb44cf788ecc2cd8bf1b2828 not found: ID does not exist" containerID="bed2e6d72dc23d14fbbd197ac01731271bf14e49fb44cf788ecc2cd8bf1b2828" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.543231 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bed2e6d72dc23d14fbbd197ac01731271bf14e49fb44cf788ecc2cd8bf1b2828"} err="failed to get container status \"bed2e6d72dc23d14fbbd197ac01731271bf14e49fb44cf788ecc2cd8bf1b2828\": rpc error: code = NotFound desc = could not find container 
\"bed2e6d72dc23d14fbbd197ac01731271bf14e49fb44cf788ecc2cd8bf1b2828\": container with ID starting with bed2e6d72dc23d14fbbd197ac01731271bf14e49fb44cf788ecc2cd8bf1b2828 not found: ID does not exist" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.543265 4764 scope.go:117] "RemoveContainer" containerID="f2c108c98ecfd46a140cf2428c8bd7a9e4bb88eab84b2da0f3783d087eb2e601" Mar 09 13:44:36 crc kubenswrapper[4764]: E0309 13:44:36.551379 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2c108c98ecfd46a140cf2428c8bd7a9e4bb88eab84b2da0f3783d087eb2e601\": container with ID starting with f2c108c98ecfd46a140cf2428c8bd7a9e4bb88eab84b2da0f3783d087eb2e601 not found: ID does not exist" containerID="f2c108c98ecfd46a140cf2428c8bd7a9e4bb88eab84b2da0f3783d087eb2e601" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.551470 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2c108c98ecfd46a140cf2428c8bd7a9e4bb88eab84b2da0f3783d087eb2e601"} err="failed to get container status \"f2c108c98ecfd46a140cf2428c8bd7a9e4bb88eab84b2da0f3783d087eb2e601\": rpc error: code = NotFound desc = could not find container \"f2c108c98ecfd46a140cf2428c8bd7a9e4bb88eab84b2da0f3783d087eb2e601\": container with ID starting with f2c108c98ecfd46a140cf2428c8bd7a9e4bb88eab84b2da0f3783d087eb2e601 not found: ID does not exist" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.567833 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.568004 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-plugins\") pod \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.568088 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-erlang-cookie\") pod \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.568170 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-tls\") pod \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.568271 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-confd\") pod \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.568304 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-plugins-conf\") pod \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.568368 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-config-data\") pod \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 
13:44:36.568464 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-erlang-cookie-secret\") pod \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.568505 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-pod-info\") pod \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.568561 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-server-conf\") pod \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.568586 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bzh8\" (UniqueName: \"kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-kube-api-access-4bzh8\") pod \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.569495 4764 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-server-conf\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.569518 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.570466 4764 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "11c8bf9f-a031-4e56-b1d7-49b407eabaf7" (UID: "11c8bf9f-a031-4e56-b1d7-49b407eabaf7"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.570718 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "507bcef1-e9ef-4eb1-85ce-358209b944bc" (UID: "507bcef1-e9ef-4eb1-85ce-358209b944bc"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.571466 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "11c8bf9f-a031-4e56-b1d7-49b407eabaf7" (UID: "11c8bf9f-a031-4e56-b1d7-49b407eabaf7"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.573496 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "11c8bf9f-a031-4e56-b1d7-49b407eabaf7" (UID: "11c8bf9f-a031-4e56-b1d7-49b407eabaf7"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.573935 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-kube-api-access-4bzh8" (OuterVolumeSpecName: "kube-api-access-4bzh8") pod "11c8bf9f-a031-4e56-b1d7-49b407eabaf7" (UID: "11c8bf9f-a031-4e56-b1d7-49b407eabaf7"). InnerVolumeSpecName "kube-api-access-4bzh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.576340 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "11c8bf9f-a031-4e56-b1d7-49b407eabaf7" (UID: "11c8bf9f-a031-4e56-b1d7-49b407eabaf7"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.578761 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-pod-info" (OuterVolumeSpecName: "pod-info") pod "11c8bf9f-a031-4e56-b1d7-49b407eabaf7" (UID: "11c8bf9f-a031-4e56-b1d7-49b407eabaf7"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.579910 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "11c8bf9f-a031-4e56-b1d7-49b407eabaf7" (UID: "11c8bf9f-a031-4e56-b1d7-49b407eabaf7"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.581311 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "11c8bf9f-a031-4e56-b1d7-49b407eabaf7" (UID: "11c8bf9f-a031-4e56-b1d7-49b407eabaf7"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.622064 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-config-data" (OuterVolumeSpecName: "config-data") pod "11c8bf9f-a031-4e56-b1d7-49b407eabaf7" (UID: "11c8bf9f-a031-4e56-b1d7-49b407eabaf7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.656068 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-server-conf" (OuterVolumeSpecName: "server-conf") pod "11c8bf9f-a031-4e56-b1d7-49b407eabaf7" (UID: "11c8bf9f-a031-4e56-b1d7-49b407eabaf7"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.671793 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.671840 4764 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.671850 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.671859 4764 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.671868 4764 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-pod-info\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.671881 4764 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-server-conf\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.671890 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bzh8\" (UniqueName: \"kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-kube-api-access-4bzh8\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.671939 4764 
reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.671954 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.671967 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.671977 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.739956 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "11c8bf9f-a031-4e56-b1d7-49b407eabaf7" (UID: "11c8bf9f-a031-4e56-b1d7-49b407eabaf7"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.743333 4764 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.776250 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.776308 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.778783 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.797924 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.807977 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 13:44:36 crc kubenswrapper[4764]: E0309 13:44:36.808564 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11c8bf9f-a031-4e56-b1d7-49b407eabaf7" containerName="setup-container" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.808610 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="11c8bf9f-a031-4e56-b1d7-49b407eabaf7" containerName="setup-container" Mar 09 13:44:36 crc kubenswrapper[4764]: E0309 13:44:36.808627 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="507bcef1-e9ef-4eb1-85ce-358209b944bc" containerName="rabbitmq" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.808635 4764 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="507bcef1-e9ef-4eb1-85ce-358209b944bc" containerName="rabbitmq" Mar 09 13:44:36 crc kubenswrapper[4764]: E0309 13:44:36.808692 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77179ff3-861b-4aab-b1b2-db4d12041264" containerName="oc" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.808704 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="77179ff3-861b-4aab-b1b2-db4d12041264" containerName="oc" Mar 09 13:44:36 crc kubenswrapper[4764]: E0309 13:44:36.808717 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="507bcef1-e9ef-4eb1-85ce-358209b944bc" containerName="setup-container" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.808724 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="507bcef1-e9ef-4eb1-85ce-358209b944bc" containerName="setup-container" Mar 09 13:44:36 crc kubenswrapper[4764]: E0309 13:44:36.808739 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11c8bf9f-a031-4e56-b1d7-49b407eabaf7" containerName="rabbitmq" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.808747 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="11c8bf9f-a031-4e56-b1d7-49b407eabaf7" containerName="rabbitmq" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.808962 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="77179ff3-861b-4aab-b1b2-db4d12041264" containerName="oc" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.808989 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="11c8bf9f-a031-4e56-b1d7-49b407eabaf7" containerName="rabbitmq" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.809006 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="507bcef1-e9ef-4eb1-85ce-358209b944bc" containerName="rabbitmq" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.821470 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.830748 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.833735 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.833898 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.833753 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-8dlbf" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.834211 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.835735 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.835968 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.839623 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.878235 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b19144b6-cc4c-41d6-ad2e-409c021f657c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.878926 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/b19144b6-cc4c-41d6-ad2e-409c021f657c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.879066 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.879185 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b19144b6-cc4c-41d6-ad2e-409c021f657c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.879308 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b19144b6-cc4c-41d6-ad2e-409c021f657c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.879420 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b19144b6-cc4c-41d6-ad2e-409c021f657c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.879535 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbghk\" (UniqueName: 
\"kubernetes.io/projected/b19144b6-cc4c-41d6-ad2e-409c021f657c-kube-api-access-dbghk\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.879724 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b19144b6-cc4c-41d6-ad2e-409c021f657c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.879909 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b19144b6-cc4c-41d6-ad2e-409c021f657c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.880035 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b19144b6-cc4c-41d6-ad2e-409c021f657c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.881104 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b19144b6-cc4c-41d6-ad2e-409c021f657c-config-data\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.983309 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b19144b6-cc4c-41d6-ad2e-409c021f657c-plugins-conf\") pod 
\"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.983379 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b19144b6-cc4c-41d6-ad2e-409c021f657c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.983407 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b19144b6-cc4c-41d6-ad2e-409c021f657c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.983427 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b19144b6-cc4c-41d6-ad2e-409c021f657c-config-data\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.983465 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b19144b6-cc4c-41d6-ad2e-409c021f657c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.983536 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b19144b6-cc4c-41d6-ad2e-409c021f657c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.983567 
4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.983592 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b19144b6-cc4c-41d6-ad2e-409c021f657c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.983612 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b19144b6-cc4c-41d6-ad2e-409c021f657c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.983636 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b19144b6-cc4c-41d6-ad2e-409c021f657c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.983658 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbghk\" (UniqueName: \"kubernetes.io/projected/b19144b6-cc4c-41d6-ad2e-409c021f657c-kube-api-access-dbghk\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.985261 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.985877 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b19144b6-cc4c-41d6-ad2e-409c021f657c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.990249 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b19144b6-cc4c-41d6-ad2e-409c021f657c-config-data\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.991176 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b19144b6-cc4c-41d6-ad2e-409c021f657c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.994180 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b19144b6-cc4c-41d6-ad2e-409c021f657c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.994333 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b19144b6-cc4c-41d6-ad2e-409c021f657c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " 
pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.995062 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b19144b6-cc4c-41d6-ad2e-409c021f657c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.995303 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b19144b6-cc4c-41d6-ad2e-409c021f657c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.000746 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b19144b6-cc4c-41d6-ad2e-409c021f657c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.001023 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b19144b6-cc4c-41d6-ad2e-409c021f657c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.003169 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbghk\" (UniqueName: \"kubernetes.io/projected/b19144b6-cc4c-41d6-ad2e-409c021f657c-kube-api-access-dbghk\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.028101 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.150973 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.412651 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"11c8bf9f-a031-4e56-b1d7-49b407eabaf7","Type":"ContainerDied","Data":"b97c921b54e1f12956d845171f6d90fe64a80d32c024a23960cca4b47667dc15"} Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.413082 4764 scope.go:117] "RemoveContainer" containerID="c051454f6de8558a84ceb371da4edc9fd851c2e829bf18eb804c0c4e657ef0ee" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.413265 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.453487 4764 scope.go:117] "RemoveContainer" containerID="fd0a79b758702c401b2bbc87884a5f0ad37053c07ad4da0c2c69610d9e0509c2" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.476753 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.491380 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.510333 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.512095 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.517674 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.517826 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-6m67z" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.517878 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.518005 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.518098 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.518191 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.518225 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.526473 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.576049 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11c8bf9f-a031-4e56-b1d7-49b407eabaf7" path="/var/lib/kubelet/pods/11c8bf9f-a031-4e56-b1d7-49b407eabaf7/volumes" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.577326 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="507bcef1-e9ef-4eb1-85ce-358209b944bc" path="/var/lib/kubelet/pods/507bcef1-e9ef-4eb1-85ce-358209b944bc/volumes" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 
13:44:37.685844 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.703784 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c5579dd7-5380-4042-8c78-c6837d841d5e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.703906 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c5579dd7-5380-4042-8c78-c6837d841d5e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.703995 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c5579dd7-5380-4042-8c78-c6837d841d5e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.704067 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c5579dd7-5380-4042-8c78-c6837d841d5e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.704111 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.704183 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c5579dd7-5380-4042-8c78-c6837d841d5e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.704364 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c5579dd7-5380-4042-8c78-c6837d841d5e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.704413 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c5579dd7-5380-4042-8c78-c6837d841d5e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.704868 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c5579dd7-5380-4042-8c78-c6837d841d5e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.704976 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwxh2\" (UniqueName: \"kubernetes.io/projected/c5579dd7-5380-4042-8c78-c6837d841d5e-kube-api-access-qwxh2\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.705069 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c5579dd7-5380-4042-8c78-c6837d841d5e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.807284 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c5579dd7-5380-4042-8c78-c6837d841d5e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.807368 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c5579dd7-5380-4042-8c78-c6837d841d5e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.807412 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c5579dd7-5380-4042-8c78-c6837d841d5e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.807430 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c5579dd7-5380-4042-8c78-c6837d841d5e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.807451 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.807481 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c5579dd7-5380-4042-8c78-c6837d841d5e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.807566 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c5579dd7-5380-4042-8c78-c6837d841d5e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.807588 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c5579dd7-5380-4042-8c78-c6837d841d5e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.807615 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c5579dd7-5380-4042-8c78-c6837d841d5e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.807642 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qwxh2\" (UniqueName: \"kubernetes.io/projected/c5579dd7-5380-4042-8c78-c6837d841d5e-kube-api-access-qwxh2\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.807690 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c5579dd7-5380-4042-8c78-c6837d841d5e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.808062 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c5579dd7-5380-4042-8c78-c6837d841d5e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.808194 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c5579dd7-5380-4042-8c78-c6837d841d5e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.809029 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c5579dd7-5380-4042-8c78-c6837d841d5e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.809201 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/c5579dd7-5380-4042-8c78-c6837d841d5e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.809258 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.810082 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c5579dd7-5380-4042-8c78-c6837d841d5e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.813723 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c5579dd7-5380-4042-8c78-c6837d841d5e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.813809 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c5579dd7-5380-4042-8c78-c6837d841d5e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.816435 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c5579dd7-5380-4042-8c78-c6837d841d5e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.816661 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c5579dd7-5380-4042-8c78-c6837d841d5e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.833633 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwxh2\" (UniqueName: \"kubernetes.io/projected/c5579dd7-5380-4042-8c78-c6837d841d5e-kube-api-access-qwxh2\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.851547 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.876296 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-f8lbq"] Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.878094 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.880696 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.895755 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.899989 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-f8lbq"] Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.013893 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-ovsdbserver-nb\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.014670 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-config\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.014723 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-ovsdbserver-sb\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.015369 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws2cl\" (UniqueName: \"kubernetes.io/projected/3912c156-63a8-4756-bc55-4e403c3807f8-kube-api-access-ws2cl\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.015570 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-openstack-edpm-ipam\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.015595 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-dns-svc\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.132051 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-openstack-edpm-ipam\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.132240 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-dns-svc\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.132358 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-ovsdbserver-nb\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.132418 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-config\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.132466 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-ovsdbserver-sb\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.132534 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws2cl\" (UniqueName: \"kubernetes.io/projected/3912c156-63a8-4756-bc55-4e403c3807f8-kube-api-access-ws2cl\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.133255 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-openstack-edpm-ipam\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.137498 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-dns-svc\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.138345 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-config\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.138784 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-ovsdbserver-nb\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.140175 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-ovsdbserver-sb\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.162292 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws2cl\" (UniqueName: \"kubernetes.io/projected/3912c156-63a8-4756-bc55-4e403c3807f8-kube-api-access-ws2cl\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.201871 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.425892 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b19144b6-cc4c-41d6-ad2e-409c021f657c","Type":"ContainerStarted","Data":"d5545c1d9524a5328fcea840cbc5054f1b2e1c872c15c4e573f15f8e5aa3158c"} Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.437875 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 13:44:38 crc kubenswrapper[4764]: W0309 13:44:38.440973 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5579dd7_5380_4042_8c78_c6837d841d5e.slice/crio-351cf9a9ccaf4afea3cf8872dfac012cf03f69c5fcff9ceafa4f3c49e86bbd21 WatchSource:0}: Error finding container 351cf9a9ccaf4afea3cf8872dfac012cf03f69c5fcff9ceafa4f3c49e86bbd21: Status 404 returned error can't find the container with id 351cf9a9ccaf4afea3cf8872dfac012cf03f69c5fcff9ceafa4f3c49e86bbd21 Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.694596 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-f8lbq"] Mar 09 13:44:39 crc kubenswrapper[4764]: I0309 13:44:39.445128 4764 generic.go:334] "Generic (PLEG): container finished" podID="3912c156-63a8-4756-bc55-4e403c3807f8" containerID="56f82ff470e1fffea03da2d37cd7b44a6189cee402a05697723d71f2a014b32a" exitCode=0 Mar 09 13:44:39 crc kubenswrapper[4764]: I0309 13:44:39.445258 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" event={"ID":"3912c156-63a8-4756-bc55-4e403c3807f8","Type":"ContainerDied","Data":"56f82ff470e1fffea03da2d37cd7b44a6189cee402a05697723d71f2a014b32a"} Mar 09 13:44:39 crc kubenswrapper[4764]: I0309 13:44:39.445836 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" 
event={"ID":"3912c156-63a8-4756-bc55-4e403c3807f8","Type":"ContainerStarted","Data":"05c41c0e7884eaf2ee86d117de131aea3691b93a29e2fa0912581efbd369cd9a"} Mar 09 13:44:39 crc kubenswrapper[4764]: I0309 13:44:39.447936 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c5579dd7-5380-4042-8c78-c6837d841d5e","Type":"ContainerStarted","Data":"351cf9a9ccaf4afea3cf8872dfac012cf03f69c5fcff9ceafa4f3c49e86bbd21"} Mar 09 13:44:40 crc kubenswrapper[4764]: I0309 13:44:40.469640 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" event={"ID":"3912c156-63a8-4756-bc55-4e403c3807f8","Type":"ContainerStarted","Data":"5a80bb357eacada450213ffb7482e6c0c27854fdde2db316d00ae017b880c5af"} Mar 09 13:44:40 crc kubenswrapper[4764]: I0309 13:44:40.470110 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:40 crc kubenswrapper[4764]: I0309 13:44:40.473443 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c5579dd7-5380-4042-8c78-c6837d841d5e","Type":"ContainerStarted","Data":"188de235de55b51ff40610026c7e52499fd41faec0f6ea44af2ec4f4625f4040"} Mar 09 13:44:40 crc kubenswrapper[4764]: I0309 13:44:40.478055 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b19144b6-cc4c-41d6-ad2e-409c021f657c","Type":"ContainerStarted","Data":"4ebc5d892f1f305b7293fcf5f466f2290393eed4c595d382d19c94d2aa4f9a50"} Mar 09 13:44:40 crc kubenswrapper[4764]: I0309 13:44:40.501368 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" podStartSLOduration=3.5013393539999997 podStartE2EDuration="3.501339354s" podCreationTimestamp="2026-03-09 13:44:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-09 13:44:40.489646972 +0000 UTC m=+1435.739818890" watchObservedRunningTime="2026-03-09 13:44:40.501339354 +0000 UTC m=+1435.751511262" Mar 09 13:44:48 crc kubenswrapper[4764]: I0309 13:44:48.203968 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:48 crc kubenswrapper[4764]: I0309 13:44:48.273763 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-bwpkr"] Mar 09 13:44:48 crc kubenswrapper[4764]: I0309 13:44:48.274091 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" podUID="4ae25624-74af-4de4-8aa1-14ea5dbc7b68" containerName="dnsmasq-dns" containerID="cri-o://37fa28134b3929cb4f988c4f648598adc00dd97c039b66d6a61153de431009b4" gracePeriod=10 Mar 09 13:44:48 crc kubenswrapper[4764]: I0309 13:44:48.462514 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fbc59fbb7-688wh"] Mar 09 13:44:48 crc kubenswrapper[4764]: I0309 13:44:48.464624 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:48 crc kubenswrapper[4764]: I0309 13:44:48.484551 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fbc59fbb7-688wh"] Mar 09 13:44:48 crc kubenswrapper[4764]: I0309 13:44:48.575502 4764 generic.go:334] "Generic (PLEG): container finished" podID="4ae25624-74af-4de4-8aa1-14ea5dbc7b68" containerID="37fa28134b3929cb4f988c4f648598adc00dd97c039b66d6a61153de431009b4" exitCode=0 Mar 09 13:44:48 crc kubenswrapper[4764]: I0309 13:44:48.575572 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" event={"ID":"4ae25624-74af-4de4-8aa1-14ea5dbc7b68","Type":"ContainerDied","Data":"37fa28134b3929cb4f988c4f648598adc00dd97c039b66d6a61153de431009b4"} Mar 09 13:44:48 crc kubenswrapper[4764]: I0309 13:44:48.589954 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-ovsdbserver-sb\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:48 crc kubenswrapper[4764]: I0309 13:44:48.590025 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-config\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:48 crc kubenswrapper[4764]: I0309 13:44:48.590049 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j76rw\" (UniqueName: \"kubernetes.io/projected/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-kube-api-access-j76rw\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " 
pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:48 crc kubenswrapper[4764]: I0309 13:44:48.590135 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-ovsdbserver-nb\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:48 crc kubenswrapper[4764]: I0309 13:44:48.590157 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-openstack-edpm-ipam\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:48 crc kubenswrapper[4764]: I0309 13:44:48.590201 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-dns-svc\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.691909 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-openstack-edpm-ipam\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.692039 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-dns-svc\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " 
pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.692128 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-ovsdbserver-sb\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.692179 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-config\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.692201 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j76rw\" (UniqueName: \"kubernetes.io/projected/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-kube-api-access-j76rw\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.692368 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-ovsdbserver-nb\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.693307 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-dns-svc\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:49 crc kubenswrapper[4764]: 
I0309 13:44:48.693366 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-ovsdbserver-nb\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.693982 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-openstack-edpm-ipam\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.694568 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-config\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.695392 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-ovsdbserver-sb\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.721067 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j76rw\" (UniqueName: \"kubernetes.io/projected/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-kube-api-access-j76rw\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.810020 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.830459 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.999523 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-config\") pod \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.999631 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-ovsdbserver-nb\") pod \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.999790 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jzfc\" (UniqueName: \"kubernetes.io/projected/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-kube-api-access-8jzfc\") pod \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.999971 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-dns-svc\") pod \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.000046 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-ovsdbserver-sb\") pod \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\" (UID: 
\"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.007355 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-kube-api-access-8jzfc" (OuterVolumeSpecName: "kube-api-access-8jzfc") pod "4ae25624-74af-4de4-8aa1-14ea5dbc7b68" (UID: "4ae25624-74af-4de4-8aa1-14ea5dbc7b68"). InnerVolumeSpecName "kube-api-access-8jzfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.059890 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4ae25624-74af-4de4-8aa1-14ea5dbc7b68" (UID: "4ae25624-74af-4de4-8aa1-14ea5dbc7b68"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.060228 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-config" (OuterVolumeSpecName: "config") pod "4ae25624-74af-4de4-8aa1-14ea5dbc7b68" (UID: "4ae25624-74af-4de4-8aa1-14ea5dbc7b68"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.060750 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4ae25624-74af-4de4-8aa1-14ea5dbc7b68" (UID: "4ae25624-74af-4de4-8aa1-14ea5dbc7b68"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.061184 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4ae25624-74af-4de4-8aa1-14ea5dbc7b68" (UID: "4ae25624-74af-4de4-8aa1-14ea5dbc7b68"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.102621 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.102670 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.102683 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jzfc\" (UniqueName: \"kubernetes.io/projected/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-kube-api-access-8jzfc\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.102692 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.102709 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.591425 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" 
event={"ID":"4ae25624-74af-4de4-8aa1-14ea5dbc7b68","Type":"ContainerDied","Data":"3ac89812f31874fa083542c67d0593b58a24f3774ac9cde06937d2e9c1a94aaf"} Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.591481 4764 scope.go:117] "RemoveContainer" containerID="37fa28134b3929cb4f988c4f648598adc00dd97c039b66d6a61153de431009b4" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.591660 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.618233 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-bwpkr"] Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.627314 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-bwpkr"] Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.630673 4764 scope.go:117] "RemoveContainer" containerID="1417bd2805b59981136a1861cd39d3dca99a51d7e7919e8b5c01839da7f91123" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.996151 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fbc59fbb7-688wh"] Mar 09 13:44:50 crc kubenswrapper[4764]: I0309 13:44:50.607451 4764 generic.go:334] "Generic (PLEG): container finished" podID="6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff" containerID="3b50e9b2a9f264bfcb22b787b33fb1bad8b4dde68292e87c3d306716bd5422e2" exitCode=0 Mar 09 13:44:50 crc kubenswrapper[4764]: I0309 13:44:50.607585 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" event={"ID":"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff","Type":"ContainerDied","Data":"3b50e9b2a9f264bfcb22b787b33fb1bad8b4dde68292e87c3d306716bd5422e2"} Mar 09 13:44:50 crc kubenswrapper[4764]: I0309 13:44:50.607696 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" 
event={"ID":"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff","Type":"ContainerStarted","Data":"7623e23ae56b8d47625b88ad80b059396febc09e926f11449c6996f010201d29"} Mar 09 13:44:51 crc kubenswrapper[4764]: I0309 13:44:51.573174 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ae25624-74af-4de4-8aa1-14ea5dbc7b68" path="/var/lib/kubelet/pods/4ae25624-74af-4de4-8aa1-14ea5dbc7b68/volumes" Mar 09 13:44:51 crc kubenswrapper[4764]: I0309 13:44:51.621568 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" event={"ID":"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff","Type":"ContainerStarted","Data":"0b92f70675fc36bf2d871b5244dabd98b74bc41c0324eee9ebebf35ece42f834"} Mar 09 13:44:51 crc kubenswrapper[4764]: I0309 13:44:51.651937 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" podStartSLOduration=3.651911845 podStartE2EDuration="3.651911845s" podCreationTimestamp="2026-03-09 13:44:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:44:51.643014648 +0000 UTC m=+1446.893186586" watchObservedRunningTime="2026-03-09 13:44:51.651911845 +0000 UTC m=+1446.902083763" Mar 09 13:44:52 crc kubenswrapper[4764]: I0309 13:44:52.631868 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:58 crc kubenswrapper[4764]: I0309 13:44:58.812842 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:58 crc kubenswrapper[4764]: I0309 13:44:58.890412 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-f8lbq"] Mar 09 13:44:58 crc kubenswrapper[4764]: I0309 13:44:58.891611 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" 
podUID="3912c156-63a8-4756-bc55-4e403c3807f8" containerName="dnsmasq-dns" containerID="cri-o://5a80bb357eacada450213ffb7482e6c0c27854fdde2db316d00ae017b880c5af" gracePeriod=10 Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.389106 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.560854 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws2cl\" (UniqueName: \"kubernetes.io/projected/3912c156-63a8-4756-bc55-4e403c3807f8-kube-api-access-ws2cl\") pod \"3912c156-63a8-4756-bc55-4e403c3807f8\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.560917 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-config\") pod \"3912c156-63a8-4756-bc55-4e403c3807f8\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.560962 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-ovsdbserver-sb\") pod \"3912c156-63a8-4756-bc55-4e403c3807f8\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.561030 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-openstack-edpm-ipam\") pod \"3912c156-63a8-4756-bc55-4e403c3807f8\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.561167 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-dns-svc\") pod \"3912c156-63a8-4756-bc55-4e403c3807f8\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.561250 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-ovsdbserver-nb\") pod \"3912c156-63a8-4756-bc55-4e403c3807f8\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.574080 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3912c156-63a8-4756-bc55-4e403c3807f8-kube-api-access-ws2cl" (OuterVolumeSpecName: "kube-api-access-ws2cl") pod "3912c156-63a8-4756-bc55-4e403c3807f8" (UID: "3912c156-63a8-4756-bc55-4e403c3807f8"). InnerVolumeSpecName "kube-api-access-ws2cl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.612845 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-config" (OuterVolumeSpecName: "config") pod "3912c156-63a8-4756-bc55-4e403c3807f8" (UID: "3912c156-63a8-4756-bc55-4e403c3807f8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.616796 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3912c156-63a8-4756-bc55-4e403c3807f8" (UID: "3912c156-63a8-4756-bc55-4e403c3807f8"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.619693 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3912c156-63a8-4756-bc55-4e403c3807f8" (UID: "3912c156-63a8-4756-bc55-4e403c3807f8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.620143 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "3912c156-63a8-4756-bc55-4e403c3807f8" (UID: "3912c156-63a8-4756-bc55-4e403c3807f8"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.620198 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3912c156-63a8-4756-bc55-4e403c3807f8" (UID: "3912c156-63a8-4756-bc55-4e403c3807f8"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.663966 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.664006 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws2cl\" (UniqueName: \"kubernetes.io/projected/3912c156-63a8-4756-bc55-4e403c3807f8-kube-api-access-ws2cl\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.664019 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.664029 4764 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.664038 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.664051 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.702510 4764 generic.go:334] "Generic (PLEG): container finished" podID="3912c156-63a8-4756-bc55-4e403c3807f8" containerID="5a80bb357eacada450213ffb7482e6c0c27854fdde2db316d00ae017b880c5af" exitCode=0 Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.702582 4764 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" event={"ID":"3912c156-63a8-4756-bc55-4e403c3807f8","Type":"ContainerDied","Data":"5a80bb357eacada450213ffb7482e6c0c27854fdde2db316d00ae017b880c5af"} Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.702610 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.702718 4764 scope.go:117] "RemoveContainer" containerID="5a80bb357eacada450213ffb7482e6c0c27854fdde2db316d00ae017b880c5af" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.702627 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" event={"ID":"3912c156-63a8-4756-bc55-4e403c3807f8","Type":"ContainerDied","Data":"05c41c0e7884eaf2ee86d117de131aea3691b93a29e2fa0912581efbd369cd9a"} Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.731362 4764 scope.go:117] "RemoveContainer" containerID="56f82ff470e1fffea03da2d37cd7b44a6189cee402a05697723d71f2a014b32a" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.743415 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-f8lbq"] Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.751881 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-f8lbq"] Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.781356 4764 scope.go:117] "RemoveContainer" containerID="5a80bb357eacada450213ffb7482e6c0c27854fdde2db316d00ae017b880c5af" Mar 09 13:44:59 crc kubenswrapper[4764]: E0309 13:44:59.782068 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a80bb357eacada450213ffb7482e6c0c27854fdde2db316d00ae017b880c5af\": container with ID starting with 5a80bb357eacada450213ffb7482e6c0c27854fdde2db316d00ae017b880c5af not found: ID does not exist" 
containerID="5a80bb357eacada450213ffb7482e6c0c27854fdde2db316d00ae017b880c5af" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.782137 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a80bb357eacada450213ffb7482e6c0c27854fdde2db316d00ae017b880c5af"} err="failed to get container status \"5a80bb357eacada450213ffb7482e6c0c27854fdde2db316d00ae017b880c5af\": rpc error: code = NotFound desc = could not find container \"5a80bb357eacada450213ffb7482e6c0c27854fdde2db316d00ae017b880c5af\": container with ID starting with 5a80bb357eacada450213ffb7482e6c0c27854fdde2db316d00ae017b880c5af not found: ID does not exist" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.782177 4764 scope.go:117] "RemoveContainer" containerID="56f82ff470e1fffea03da2d37cd7b44a6189cee402a05697723d71f2a014b32a" Mar 09 13:44:59 crc kubenswrapper[4764]: E0309 13:44:59.783060 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56f82ff470e1fffea03da2d37cd7b44a6189cee402a05697723d71f2a014b32a\": container with ID starting with 56f82ff470e1fffea03da2d37cd7b44a6189cee402a05697723d71f2a014b32a not found: ID does not exist" containerID="56f82ff470e1fffea03da2d37cd7b44a6189cee402a05697723d71f2a014b32a" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.783108 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56f82ff470e1fffea03da2d37cd7b44a6189cee402a05697723d71f2a014b32a"} err="failed to get container status \"56f82ff470e1fffea03da2d37cd7b44a6189cee402a05697723d71f2a014b32a\": rpc error: code = NotFound desc = could not find container \"56f82ff470e1fffea03da2d37cd7b44a6189cee402a05697723d71f2a014b32a\": container with ID starting with 56f82ff470e1fffea03da2d37cd7b44a6189cee402a05697723d71f2a014b32a not found: ID does not exist" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.151973 4764 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt"] Mar 09 13:45:00 crc kubenswrapper[4764]: E0309 13:45:00.152776 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae25624-74af-4de4-8aa1-14ea5dbc7b68" containerName="dnsmasq-dns" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.152792 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae25624-74af-4de4-8aa1-14ea5dbc7b68" containerName="dnsmasq-dns" Mar 09 13:45:00 crc kubenswrapper[4764]: E0309 13:45:00.152839 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae25624-74af-4de4-8aa1-14ea5dbc7b68" containerName="init" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.152846 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae25624-74af-4de4-8aa1-14ea5dbc7b68" containerName="init" Mar 09 13:45:00 crc kubenswrapper[4764]: E0309 13:45:00.152859 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3912c156-63a8-4756-bc55-4e403c3807f8" containerName="init" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.152865 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3912c156-63a8-4756-bc55-4e403c3807f8" containerName="init" Mar 09 13:45:00 crc kubenswrapper[4764]: E0309 13:45:00.152878 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3912c156-63a8-4756-bc55-4e403c3807f8" containerName="dnsmasq-dns" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.152884 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3912c156-63a8-4756-bc55-4e403c3807f8" containerName="dnsmasq-dns" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.153056 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="3912c156-63a8-4756-bc55-4e403c3807f8" containerName="dnsmasq-dns" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.153076 4764 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4ae25624-74af-4de4-8aa1-14ea5dbc7b68" containerName="dnsmasq-dns" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.154020 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.158562 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.159845 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.165480 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt"] Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.173449 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-config-volume\") pod \"collect-profiles-29551065-gdnvt\" (UID: \"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.173533 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-secret-volume\") pod \"collect-profiles-29551065-gdnvt\" (UID: \"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.173703 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hpj2\" (UniqueName: 
\"kubernetes.io/projected/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-kube-api-access-4hpj2\") pod \"collect-profiles-29551065-gdnvt\" (UID: \"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.275718 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-secret-volume\") pod \"collect-profiles-29551065-gdnvt\" (UID: \"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.275845 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hpj2\" (UniqueName: \"kubernetes.io/projected/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-kube-api-access-4hpj2\") pod \"collect-profiles-29551065-gdnvt\" (UID: \"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.275937 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-config-volume\") pod \"collect-profiles-29551065-gdnvt\" (UID: \"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.276875 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-config-volume\") pod \"collect-profiles-29551065-gdnvt\" (UID: \"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 
13:45:00.284087 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-secret-volume\") pod \"collect-profiles-29551065-gdnvt\" (UID: \"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.299226 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hpj2\" (UniqueName: \"kubernetes.io/projected/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-kube-api-access-4hpj2\") pod \"collect-profiles-29551065-gdnvt\" (UID: \"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.477926 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.941325 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt"] Mar 09 13:45:01 crc kubenswrapper[4764]: I0309 13:45:01.575941 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3912c156-63a8-4756-bc55-4e403c3807f8" path="/var/lib/kubelet/pods/3912c156-63a8-4756-bc55-4e403c3807f8/volumes" Mar 09 13:45:01 crc kubenswrapper[4764]: I0309 13:45:01.736424 4764 generic.go:334] "Generic (PLEG): container finished" podID="5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d" containerID="797b0150f56f98282613dd9aa20b75479aae0c1306bab3b67c8a86168757b198" exitCode=0 Mar 09 13:45:01 crc kubenswrapper[4764]: I0309 13:45:01.736483 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt" 
event={"ID":"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d","Type":"ContainerDied","Data":"797b0150f56f98282613dd9aa20b75479aae0c1306bab3b67c8a86168757b198"} Mar 09 13:45:01 crc kubenswrapper[4764]: I0309 13:45:01.736524 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt" event={"ID":"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d","Type":"ContainerStarted","Data":"6d6a58716a56d842f085937a0ecf38105dd034284c1e2744c05cc090d42d339a"} Mar 09 13:45:03 crc kubenswrapper[4764]: I0309 13:45:03.079072 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt" Mar 09 13:45:03 crc kubenswrapper[4764]: I0309 13:45:03.134409 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-secret-volume\") pod \"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d\" (UID: \"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d\") " Mar 09 13:45:03 crc kubenswrapper[4764]: I0309 13:45:03.134825 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hpj2\" (UniqueName: \"kubernetes.io/projected/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-kube-api-access-4hpj2\") pod \"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d\" (UID: \"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d\") " Mar 09 13:45:03 crc kubenswrapper[4764]: I0309 13:45:03.142141 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d" (UID: "5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:45:03 crc kubenswrapper[4764]: I0309 13:45:03.142187 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-kube-api-access-4hpj2" (OuterVolumeSpecName: "kube-api-access-4hpj2") pod "5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d" (UID: "5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d"). InnerVolumeSpecName "kube-api-access-4hpj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:45:03 crc kubenswrapper[4764]: I0309 13:45:03.237081 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-config-volume\") pod \"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d\" (UID: \"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d\") " Mar 09 13:45:03 crc kubenswrapper[4764]: I0309 13:45:03.237577 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:03 crc kubenswrapper[4764]: I0309 13:45:03.237608 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hpj2\" (UniqueName: \"kubernetes.io/projected/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-kube-api-access-4hpj2\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:03 crc kubenswrapper[4764]: I0309 13:45:03.237951 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-config-volume" (OuterVolumeSpecName: "config-volume") pod "5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d" (UID: "5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:45:03 crc kubenswrapper[4764]: I0309 13:45:03.340760 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:03 crc kubenswrapper[4764]: I0309 13:45:03.762411 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt" event={"ID":"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d","Type":"ContainerDied","Data":"6d6a58716a56d842f085937a0ecf38105dd034284c1e2744c05cc090d42d339a"} Mar 09 13:45:03 crc kubenswrapper[4764]: I0309 13:45:03.762475 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d6a58716a56d842f085937a0ecf38105dd034284c1e2744c05cc090d42d339a" Mar 09 13:45:03 crc kubenswrapper[4764]: I0309 13:45:03.762476 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt" Mar 09 13:45:05 crc kubenswrapper[4764]: I0309 13:45:05.013884 4764 scope.go:117] "RemoveContainer" containerID="c0f759ebfc59d520d002517970a3889749936b75779f65069038f6e34fd87723" Mar 09 13:45:08 crc kubenswrapper[4764]: I0309 13:45:08.994097 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7"] Mar 09 13:45:08 crc kubenswrapper[4764]: E0309 13:45:08.995336 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d" containerName="collect-profiles" Mar 09 13:45:08 crc kubenswrapper[4764]: I0309 13:45:08.995353 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d" containerName="collect-profiles" Mar 09 13:45:08 crc kubenswrapper[4764]: I0309 13:45:08.995549 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d" containerName="collect-profiles" Mar 09 13:45:08 crc kubenswrapper[4764]: I0309 13:45:08.996340 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" Mar 09 13:45:08 crc kubenswrapper[4764]: I0309 13:45:08.999332 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:08.999996 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.000385 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.000597 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.012680 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7"] Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.170197 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npdx4\" (UniqueName: \"kubernetes.io/projected/07f61b11-aba4-469c-a5ed-9566f1951559-kube-api-access-npdx4\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7\" (UID: \"07f61b11-aba4-469c-a5ed-9566f1951559\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.170361 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7\" (UID: \"07f61b11-aba4-469c-a5ed-9566f1951559\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.170475 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7\" (UID: \"07f61b11-aba4-469c-a5ed-9566f1951559\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.170504 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7\" (UID: \"07f61b11-aba4-469c-a5ed-9566f1951559\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.272291 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7\" (UID: \"07f61b11-aba4-469c-a5ed-9566f1951559\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.272356 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7\" (UID: \"07f61b11-aba4-469c-a5ed-9566f1951559\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.272391 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7\" (UID: \"07f61b11-aba4-469c-a5ed-9566f1951559\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.272479 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npdx4\" (UniqueName: \"kubernetes.io/projected/07f61b11-aba4-469c-a5ed-9566f1951559-kube-api-access-npdx4\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7\" (UID: \"07f61b11-aba4-469c-a5ed-9566f1951559\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.289726 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7\" (UID: \"07f61b11-aba4-469c-a5ed-9566f1951559\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.293558 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7\" (UID: \"07f61b11-aba4-469c-a5ed-9566f1951559\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.296953 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7\" (UID: \"07f61b11-aba4-469c-a5ed-9566f1951559\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.304771 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npdx4\" (UniqueName: \"kubernetes.io/projected/07f61b11-aba4-469c-a5ed-9566f1951559-kube-api-access-npdx4\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7\" (UID: \"07f61b11-aba4-469c-a5ed-9566f1951559\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.328357 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.883728 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7"] Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.891514 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 13:45:10 crc kubenswrapper[4764]: I0309 13:45:10.834496 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" event={"ID":"07f61b11-aba4-469c-a5ed-9566f1951559","Type":"ContainerStarted","Data":"126e7156c2e3006e0a3f1b9868e0b20082a6b17496796473f809ef087c89923d"} Mar 09 13:45:12 crc kubenswrapper[4764]: I0309 13:45:12.861982 4764 generic.go:334] "Generic (PLEG): container finished" podID="c5579dd7-5380-4042-8c78-c6837d841d5e" containerID="188de235de55b51ff40610026c7e52499fd41faec0f6ea44af2ec4f4625f4040" exitCode=0 Mar 09 13:45:12 crc kubenswrapper[4764]: I0309 13:45:12.862493 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c5579dd7-5380-4042-8c78-c6837d841d5e","Type":"ContainerDied","Data":"188de235de55b51ff40610026c7e52499fd41faec0f6ea44af2ec4f4625f4040"} Mar 09 13:45:12 crc 
kubenswrapper[4764]: I0309 13:45:12.866600 4764 generic.go:334] "Generic (PLEG): container finished" podID="b19144b6-cc4c-41d6-ad2e-409c021f657c" containerID="4ebc5d892f1f305b7293fcf5f466f2290393eed4c595d382d19c94d2aa4f9a50" exitCode=0 Mar 09 13:45:12 crc kubenswrapper[4764]: I0309 13:45:12.866677 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b19144b6-cc4c-41d6-ad2e-409c021f657c","Type":"ContainerDied","Data":"4ebc5d892f1f305b7293fcf5f466f2290393eed4c595d382d19c94d2aa4f9a50"} Mar 09 13:45:13 crc kubenswrapper[4764]: I0309 13:45:13.877314 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c5579dd7-5380-4042-8c78-c6837d841d5e","Type":"ContainerStarted","Data":"6c3db23c5bed35b1a88f34166c094e7dd5ad5d574e5e1e8d461ac988365c8de5"} Mar 09 13:45:13 crc kubenswrapper[4764]: I0309 13:45:13.878217 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:45:13 crc kubenswrapper[4764]: I0309 13:45:13.879021 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b19144b6-cc4c-41d6-ad2e-409c021f657c","Type":"ContainerStarted","Data":"e39b9f2171082500f2f291393f6f825360523292c0d053435a6028812f74ae5a"} Mar 09 13:45:13 crc kubenswrapper[4764]: I0309 13:45:13.879494 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 09 13:45:13 crc kubenswrapper[4764]: I0309 13:45:13.914105 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.914082065 podStartE2EDuration="36.914082065s" podCreationTimestamp="2026-03-09 13:44:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:45:13.908462675 +0000 UTC m=+1469.158634613" 
watchObservedRunningTime="2026-03-09 13:45:13.914082065 +0000 UTC m=+1469.164253983" Mar 09 13:45:13 crc kubenswrapper[4764]: I0309 13:45:13.944753 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.944727812 podStartE2EDuration="37.944727812s" podCreationTimestamp="2026-03-09 13:44:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:45:13.934990482 +0000 UTC m=+1469.185162390" watchObservedRunningTime="2026-03-09 13:45:13.944727812 +0000 UTC m=+1469.194899720" Mar 09 13:45:19 crc kubenswrapper[4764]: I0309 13:45:19.942834 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" event={"ID":"07f61b11-aba4-469c-a5ed-9566f1951559","Type":"ContainerStarted","Data":"bde45a06faa972981715117bca5bea2ffa8f521b6f7a639569a4330b9afefe5a"} Mar 09 13:45:19 crc kubenswrapper[4764]: I0309 13:45:19.963808 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" podStartSLOduration=2.64079206 podStartE2EDuration="11.963784219s" podCreationTimestamp="2026-03-09 13:45:08 +0000 UTC" firstStartedPulling="2026-03-09 13:45:09.891264778 +0000 UTC m=+1465.141436686" lastFinishedPulling="2026-03-09 13:45:19.214256937 +0000 UTC m=+1474.464428845" observedRunningTime="2026-03-09 13:45:19.9623275 +0000 UTC m=+1475.212499428" watchObservedRunningTime="2026-03-09 13:45:19.963784219 +0000 UTC m=+1475.213956147" Mar 09 13:45:27 crc kubenswrapper[4764]: I0309 13:45:27.155020 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 09 13:45:27 crc kubenswrapper[4764]: I0309 13:45:27.900537 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:45:30 
crc kubenswrapper[4764]: I0309 13:45:30.044924 4764 generic.go:334] "Generic (PLEG): container finished" podID="07f61b11-aba4-469c-a5ed-9566f1951559" containerID="bde45a06faa972981715117bca5bea2ffa8f521b6f7a639569a4330b9afefe5a" exitCode=0 Mar 09 13:45:30 crc kubenswrapper[4764]: I0309 13:45:30.045021 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" event={"ID":"07f61b11-aba4-469c-a5ed-9566f1951559","Type":"ContainerDied","Data":"bde45a06faa972981715117bca5bea2ffa8f521b6f7a639569a4330b9afefe5a"} Mar 09 13:45:31 crc kubenswrapper[4764]: I0309 13:45:31.783227 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" Mar 09 13:45:31 crc kubenswrapper[4764]: I0309 13:45:31.923296 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-ssh-key-openstack-edpm-ipam\") pod \"07f61b11-aba4-469c-a5ed-9566f1951559\" (UID: \"07f61b11-aba4-469c-a5ed-9566f1951559\") " Mar 09 13:45:31 crc kubenswrapper[4764]: I0309 13:45:31.923688 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npdx4\" (UniqueName: \"kubernetes.io/projected/07f61b11-aba4-469c-a5ed-9566f1951559-kube-api-access-npdx4\") pod \"07f61b11-aba4-469c-a5ed-9566f1951559\" (UID: \"07f61b11-aba4-469c-a5ed-9566f1951559\") " Mar 09 13:45:31 crc kubenswrapper[4764]: I0309 13:45:31.923838 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-inventory\") pod \"07f61b11-aba4-469c-a5ed-9566f1951559\" (UID: \"07f61b11-aba4-469c-a5ed-9566f1951559\") " Mar 09 13:45:31 crc kubenswrapper[4764]: I0309 13:45:31.924270 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-repo-setup-combined-ca-bundle\") pod \"07f61b11-aba4-469c-a5ed-9566f1951559\" (UID: \"07f61b11-aba4-469c-a5ed-9566f1951559\") " Mar 09 13:45:31 crc kubenswrapper[4764]: I0309 13:45:31.931096 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "07f61b11-aba4-469c-a5ed-9566f1951559" (UID: "07f61b11-aba4-469c-a5ed-9566f1951559"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:45:31 crc kubenswrapper[4764]: I0309 13:45:31.935687 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07f61b11-aba4-469c-a5ed-9566f1951559-kube-api-access-npdx4" (OuterVolumeSpecName: "kube-api-access-npdx4") pod "07f61b11-aba4-469c-a5ed-9566f1951559" (UID: "07f61b11-aba4-469c-a5ed-9566f1951559"). InnerVolumeSpecName "kube-api-access-npdx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:45:31 crc kubenswrapper[4764]: I0309 13:45:31.952661 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "07f61b11-aba4-469c-a5ed-9566f1951559" (UID: "07f61b11-aba4-469c-a5ed-9566f1951559"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:45:31 crc kubenswrapper[4764]: I0309 13:45:31.954889 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-inventory" (OuterVolumeSpecName: "inventory") pod "07f61b11-aba4-469c-a5ed-9566f1951559" (UID: "07f61b11-aba4-469c-a5ed-9566f1951559"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.026913 4764 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.026964 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.026976 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npdx4\" (UniqueName: \"kubernetes.io/projected/07f61b11-aba4-469c-a5ed-9566f1951559-kube-api-access-npdx4\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.026987 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.073814 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" event={"ID":"07f61b11-aba4-469c-a5ed-9566f1951559","Type":"ContainerDied","Data":"126e7156c2e3006e0a3f1b9868e0b20082a6b17496796473f809ef087c89923d"} Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.073878 
4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="126e7156c2e3006e0a3f1b9868e0b20082a6b17496796473f809ef087c89923d" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.074027 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.154721 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr"] Mar 09 13:45:32 crc kubenswrapper[4764]: E0309 13:45:32.155264 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07f61b11-aba4-469c-a5ed-9566f1951559" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.155287 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="07f61b11-aba4-469c-a5ed-9566f1951559" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.155482 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="07f61b11-aba4-469c-a5ed-9566f1951559" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.156279 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.159714 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.160831 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.161043 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.164098 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.177722 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr"] Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.231542 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr\" (UID: \"fe0d9990-083b-428b-baec-a40ae99487db\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.231633 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrpw2\" (UniqueName: \"kubernetes.io/projected/fe0d9990-083b-428b-baec-a40ae99487db-kube-api-access-rrpw2\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr\" (UID: \"fe0d9990-083b-428b-baec-a40ae99487db\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 
13:45:32.231681 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr\" (UID: \"fe0d9990-083b-428b-baec-a40ae99487db\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.231713 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr\" (UID: \"fe0d9990-083b-428b-baec-a40ae99487db\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.335021 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr\" (UID: \"fe0d9990-083b-428b-baec-a40ae99487db\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.335237 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrpw2\" (UniqueName: \"kubernetes.io/projected/fe0d9990-083b-428b-baec-a40ae99487db-kube-api-access-rrpw2\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr\" (UID: \"fe0d9990-083b-428b-baec-a40ae99487db\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.335288 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr\" (UID: \"fe0d9990-083b-428b-baec-a40ae99487db\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr"
Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.335335 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr\" (UID: \"fe0d9990-083b-428b-baec-a40ae99487db\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr"
Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.340737 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr\" (UID: \"fe0d9990-083b-428b-baec-a40ae99487db\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr"
Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.341312 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr\" (UID: \"fe0d9990-083b-428b-baec-a40ae99487db\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr"
Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.346559 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr\" (UID: \"fe0d9990-083b-428b-baec-a40ae99487db\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr"
Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.360691 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrpw2\" (UniqueName: \"kubernetes.io/projected/fe0d9990-083b-428b-baec-a40ae99487db-kube-api-access-rrpw2\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr\" (UID: \"fe0d9990-083b-428b-baec-a40ae99487db\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr"
Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.476950 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr"
Mar 09 13:45:33 crc kubenswrapper[4764]: I0309 13:45:33.034617 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr"]
Mar 09 13:45:33 crc kubenswrapper[4764]: W0309 13:45:33.036102 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe0d9990_083b_428b_baec_a40ae99487db.slice/crio-88f1f6de4e55f98cf3fb91581e4ad15ec69d81b47c69d9187bda289cfb67da48 WatchSource:0}: Error finding container 88f1f6de4e55f98cf3fb91581e4ad15ec69d81b47c69d9187bda289cfb67da48: Status 404 returned error can't find the container with id 88f1f6de4e55f98cf3fb91581e4ad15ec69d81b47c69d9187bda289cfb67da48
Mar 09 13:45:33 crc kubenswrapper[4764]: I0309 13:45:33.085800 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr" event={"ID":"fe0d9990-083b-428b-baec-a40ae99487db","Type":"ContainerStarted","Data":"88f1f6de4e55f98cf3fb91581e4ad15ec69d81b47c69d9187bda289cfb67da48"}
Mar 09 13:45:34 crc kubenswrapper[4764]: I0309 13:45:34.099152 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr" event={"ID":"fe0d9990-083b-428b-baec-a40ae99487db","Type":"ContainerStarted","Data":"6e4f8f0feb8ec9ecf660634549716b0c350cba173b98c033e4c7e09aa6bd108d"}
Mar 09 13:45:34 crc kubenswrapper[4764]: I0309 13:45:34.126799 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr" podStartSLOduration=1.692755281 podStartE2EDuration="2.126773931s" podCreationTimestamp="2026-03-09 13:45:32 +0000 UTC" firstStartedPulling="2026-03-09 13:45:33.041133228 +0000 UTC m=+1488.291305136" lastFinishedPulling="2026-03-09 13:45:33.475151878 +0000 UTC m=+1488.725323786" observedRunningTime="2026-03-09 13:45:34.119027624 +0000 UTC m=+1489.369199542" watchObservedRunningTime="2026-03-09 13:45:34.126773931 +0000 UTC m=+1489.376945849"
Mar 09 13:45:47 crc kubenswrapper[4764]: I0309 13:45:47.107078 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m5wvh"]
Mar 09 13:45:47 crc kubenswrapper[4764]: I0309 13:45:47.110691 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m5wvh"
Mar 09 13:45:47 crc kubenswrapper[4764]: I0309 13:45:47.117029 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m5wvh"]
Mar 09 13:45:47 crc kubenswrapper[4764]: I0309 13:45:47.186505 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhw7v\" (UniqueName: \"kubernetes.io/projected/17216f70-2204-498b-9a97-97d6ce40bd8d-kube-api-access-fhw7v\") pod \"redhat-operators-m5wvh\" (UID: \"17216f70-2204-498b-9a97-97d6ce40bd8d\") " pod="openshift-marketplace/redhat-operators-m5wvh"
Mar 09 13:45:47 crc kubenswrapper[4764]: I0309 13:45:47.186733 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17216f70-2204-498b-9a97-97d6ce40bd8d-catalog-content\") pod \"redhat-operators-m5wvh\" (UID: \"17216f70-2204-498b-9a97-97d6ce40bd8d\") " pod="openshift-marketplace/redhat-operators-m5wvh"
Mar 09 13:45:47 crc kubenswrapper[4764]: I0309 13:45:47.186776 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17216f70-2204-498b-9a97-97d6ce40bd8d-utilities\") pod \"redhat-operators-m5wvh\" (UID: \"17216f70-2204-498b-9a97-97d6ce40bd8d\") " pod="openshift-marketplace/redhat-operators-m5wvh"
Mar 09 13:45:47 crc kubenswrapper[4764]: I0309 13:45:47.290065 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhw7v\" (UniqueName: \"kubernetes.io/projected/17216f70-2204-498b-9a97-97d6ce40bd8d-kube-api-access-fhw7v\") pod \"redhat-operators-m5wvh\" (UID: \"17216f70-2204-498b-9a97-97d6ce40bd8d\") " pod="openshift-marketplace/redhat-operators-m5wvh"
Mar 09 13:45:47 crc kubenswrapper[4764]: I0309 13:45:47.290202 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17216f70-2204-498b-9a97-97d6ce40bd8d-catalog-content\") pod \"redhat-operators-m5wvh\" (UID: \"17216f70-2204-498b-9a97-97d6ce40bd8d\") " pod="openshift-marketplace/redhat-operators-m5wvh"
Mar 09 13:45:47 crc kubenswrapper[4764]: I0309 13:45:47.290246 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17216f70-2204-498b-9a97-97d6ce40bd8d-utilities\") pod \"redhat-operators-m5wvh\" (UID: \"17216f70-2204-498b-9a97-97d6ce40bd8d\") " pod="openshift-marketplace/redhat-operators-m5wvh"
Mar 09 13:45:47 crc kubenswrapper[4764]: I0309 13:45:47.291006 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17216f70-2204-498b-9a97-97d6ce40bd8d-utilities\") pod \"redhat-operators-m5wvh\" (UID: \"17216f70-2204-498b-9a97-97d6ce40bd8d\") " pod="openshift-marketplace/redhat-operators-m5wvh"
Mar 09 13:45:47 crc kubenswrapper[4764]: I0309 13:45:47.291086 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17216f70-2204-498b-9a97-97d6ce40bd8d-catalog-content\") pod \"redhat-operators-m5wvh\" (UID: \"17216f70-2204-498b-9a97-97d6ce40bd8d\") " pod="openshift-marketplace/redhat-operators-m5wvh"
Mar 09 13:45:47 crc kubenswrapper[4764]: I0309 13:45:47.313543 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhw7v\" (UniqueName: \"kubernetes.io/projected/17216f70-2204-498b-9a97-97d6ce40bd8d-kube-api-access-fhw7v\") pod \"redhat-operators-m5wvh\" (UID: \"17216f70-2204-498b-9a97-97d6ce40bd8d\") " pod="openshift-marketplace/redhat-operators-m5wvh"
Mar 09 13:45:47 crc kubenswrapper[4764]: I0309 13:45:47.442545 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m5wvh"
Mar 09 13:45:47 crc kubenswrapper[4764]: I0309 13:45:47.978334 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m5wvh"]
Mar 09 13:45:48 crc kubenswrapper[4764]: I0309 13:45:48.262113 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5wvh" event={"ID":"17216f70-2204-498b-9a97-97d6ce40bd8d","Type":"ContainerStarted","Data":"6365ac478b93353f9f38b36c4bfb228fb56e0098a8f6f51ebddbad0e5763fd55"}
Mar 09 13:45:49 crc kubenswrapper[4764]: I0309 13:45:49.273302 4764 generic.go:334] "Generic (PLEG): container finished" podID="17216f70-2204-498b-9a97-97d6ce40bd8d" containerID="4408ae7df69e11c3a006ccacaf85cc6b3509b54b8182706c5b060cacebfd0ac1" exitCode=0
Mar 09 13:45:49 crc kubenswrapper[4764]: I0309 13:45:49.273393 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5wvh" event={"ID":"17216f70-2204-498b-9a97-97d6ce40bd8d","Type":"ContainerDied","Data":"4408ae7df69e11c3a006ccacaf85cc6b3509b54b8182706c5b060cacebfd0ac1"}
Mar 09 13:45:53 crc kubenswrapper[4764]: I0309 13:45:53.320676 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5wvh" event={"ID":"17216f70-2204-498b-9a97-97d6ce40bd8d","Type":"ContainerStarted","Data":"1f65c2b840e0a2ef446a8cc3913e0633276d45ba57e7a31e6e4b8ecb08764b0e"}
Mar 09 13:45:55 crc kubenswrapper[4764]: I0309 13:45:55.346269 4764 generic.go:334] "Generic (PLEG): container finished" podID="17216f70-2204-498b-9a97-97d6ce40bd8d" containerID="1f65c2b840e0a2ef446a8cc3913e0633276d45ba57e7a31e6e4b8ecb08764b0e" exitCode=0
Mar 09 13:45:55 crc kubenswrapper[4764]: I0309 13:45:55.346325 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5wvh" event={"ID":"17216f70-2204-498b-9a97-97d6ce40bd8d","Type":"ContainerDied","Data":"1f65c2b840e0a2ef446a8cc3913e0633276d45ba57e7a31e6e4b8ecb08764b0e"}
Mar 09 13:45:56 crc kubenswrapper[4764]: I0309 13:45:56.366507 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5wvh" event={"ID":"17216f70-2204-498b-9a97-97d6ce40bd8d","Type":"ContainerStarted","Data":"fcd44c7ba1ba4d3dd3602f774658ca367a62c8198a3597b77cff64b7b33a5be7"}
Mar 09 13:45:56 crc kubenswrapper[4764]: I0309 13:45:56.396246 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m5wvh" podStartSLOduration=3.95046263 podStartE2EDuration="9.396220113s" podCreationTimestamp="2026-03-09 13:45:47 +0000 UTC" firstStartedPulling="2026-03-09 13:45:50.287468895 +0000 UTC m=+1505.537640803" lastFinishedPulling="2026-03-09 13:45:55.733226358 +0000 UTC m=+1510.983398286" observedRunningTime="2026-03-09 13:45:56.39423434 +0000 UTC m=+1511.644406268" watchObservedRunningTime="2026-03-09 13:45:56.396220113 +0000 UTC m=+1511.646392051"
Mar 09 13:45:57 crc kubenswrapper[4764]: I0309 13:45:57.442703 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m5wvh"
Mar 09 13:45:57 crc kubenswrapper[4764]: I0309 13:45:57.443822 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m5wvh"
Mar 09 13:45:58 crc kubenswrapper[4764]: I0309 13:45:58.488538 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m5wvh" podUID="17216f70-2204-498b-9a97-97d6ce40bd8d" containerName="registry-server" probeResult="failure" output=<
Mar 09 13:45:58 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s
Mar 09 13:45:58 crc kubenswrapper[4764]: >
Mar 09 13:46:00 crc kubenswrapper[4764]: I0309 13:46:00.145934 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551066-q2m88"]
Mar 09 13:46:00 crc kubenswrapper[4764]: I0309 13:46:00.147948 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551066-q2m88"
Mar 09 13:46:00 crc kubenswrapper[4764]: I0309 13:46:00.150951 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 13:46:00 crc kubenswrapper[4764]: I0309 13:46:00.151376 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 13:46:00 crc kubenswrapper[4764]: I0309 13:46:00.154589 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr"
Mar 09 13:46:00 crc kubenswrapper[4764]: I0309 13:46:00.179333 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551066-q2m88"]
Mar 09 13:46:00 crc kubenswrapper[4764]: I0309 13:46:00.306961 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pffnw\" (UniqueName: \"kubernetes.io/projected/2f277802-4cc0-41e2-90f9-a9e2ac441979-kube-api-access-pffnw\") pod \"auto-csr-approver-29551066-q2m88\" (UID: \"2f277802-4cc0-41e2-90f9-a9e2ac441979\") " pod="openshift-infra/auto-csr-approver-29551066-q2m88"
Mar 09 13:46:00 crc kubenswrapper[4764]: I0309 13:46:00.410098 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pffnw\" (UniqueName: \"kubernetes.io/projected/2f277802-4cc0-41e2-90f9-a9e2ac441979-kube-api-access-pffnw\") pod \"auto-csr-approver-29551066-q2m88\" (UID: \"2f277802-4cc0-41e2-90f9-a9e2ac441979\") " pod="openshift-infra/auto-csr-approver-29551066-q2m88"
Mar 09 13:46:00 crc kubenswrapper[4764]: I0309 13:46:00.433204 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pffnw\" (UniqueName: \"kubernetes.io/projected/2f277802-4cc0-41e2-90f9-a9e2ac441979-kube-api-access-pffnw\") pod \"auto-csr-approver-29551066-q2m88\" (UID: \"2f277802-4cc0-41e2-90f9-a9e2ac441979\") " pod="openshift-infra/auto-csr-approver-29551066-q2m88"
Mar 09 13:46:00 crc kubenswrapper[4764]: I0309 13:46:00.506463 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551066-q2m88"
Mar 09 13:46:01 crc kubenswrapper[4764]: I0309 13:46:01.013418 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551066-q2m88"]
Mar 09 13:46:01 crc kubenswrapper[4764]: I0309 13:46:01.415509 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551066-q2m88" event={"ID":"2f277802-4cc0-41e2-90f9-a9e2ac441979","Type":"ContainerStarted","Data":"e09f6c465a0041bea1efa258d3a816638c67fbb6e52766cc7015b1ba22284328"}
Mar 09 13:46:02 crc kubenswrapper[4764]: I0309 13:46:02.427666 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551066-q2m88" event={"ID":"2f277802-4cc0-41e2-90f9-a9e2ac441979","Type":"ContainerStarted","Data":"ad8cc18bf0e9a68496606eb3c3aef5b4008faa06bd6343718c7f8c425fd14c07"}
Mar 09 13:46:03 crc kubenswrapper[4764]: I0309 13:46:03.441402 4764 generic.go:334] "Generic (PLEG): container finished" podID="2f277802-4cc0-41e2-90f9-a9e2ac441979" containerID="ad8cc18bf0e9a68496606eb3c3aef5b4008faa06bd6343718c7f8c425fd14c07" exitCode=0
Mar 09 13:46:03 crc kubenswrapper[4764]: I0309 13:46:03.441455 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551066-q2m88" event={"ID":"2f277802-4cc0-41e2-90f9-a9e2ac441979","Type":"ContainerDied","Data":"ad8cc18bf0e9a68496606eb3c3aef5b4008faa06bd6343718c7f8c425fd14c07"}
Mar 09 13:46:04 crc kubenswrapper[4764]: I0309 13:46:04.876715 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551066-q2m88"
Mar 09 13:46:05 crc kubenswrapper[4764]: I0309 13:46:05.013480 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pffnw\" (UniqueName: \"kubernetes.io/projected/2f277802-4cc0-41e2-90f9-a9e2ac441979-kube-api-access-pffnw\") pod \"2f277802-4cc0-41e2-90f9-a9e2ac441979\" (UID: \"2f277802-4cc0-41e2-90f9-a9e2ac441979\") "
Mar 09 13:46:05 crc kubenswrapper[4764]: I0309 13:46:05.021596 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f277802-4cc0-41e2-90f9-a9e2ac441979-kube-api-access-pffnw" (OuterVolumeSpecName: "kube-api-access-pffnw") pod "2f277802-4cc0-41e2-90f9-a9e2ac441979" (UID: "2f277802-4cc0-41e2-90f9-a9e2ac441979"). InnerVolumeSpecName "kube-api-access-pffnw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:46:05 crc kubenswrapper[4764]: I0309 13:46:05.116341 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pffnw\" (UniqueName: \"kubernetes.io/projected/2f277802-4cc0-41e2-90f9-a9e2ac441979-kube-api-access-pffnw\") on node \"crc\" DevicePath \"\""
Mar 09 13:46:05 crc kubenswrapper[4764]: I0309 13:46:05.178518 4764 scope.go:117] "RemoveContainer" containerID="9847def4972a082de5de5ce66af1aea1d5c2f1147d6cb0e73ffadb728f7879a5"
Mar 09 13:46:05 crc kubenswrapper[4764]: I0309 13:46:05.469572 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551066-q2m88" event={"ID":"2f277802-4cc0-41e2-90f9-a9e2ac441979","Type":"ContainerDied","Data":"e09f6c465a0041bea1efa258d3a816638c67fbb6e52766cc7015b1ba22284328"}
Mar 09 13:46:05 crc kubenswrapper[4764]: I0309 13:46:05.469625 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e09f6c465a0041bea1efa258d3a816638c67fbb6e52766cc7015b1ba22284328"
Mar 09 13:46:05 crc kubenswrapper[4764]: I0309 13:46:05.469708 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551066-q2m88"
Mar 09 13:46:05 crc kubenswrapper[4764]: I0309 13:46:05.532397 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551060-n7f54"]
Mar 09 13:46:05 crc kubenswrapper[4764]: I0309 13:46:05.550477 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551060-n7f54"]
Mar 09 13:46:05 crc kubenswrapper[4764]: I0309 13:46:05.574399 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16623a65-1bef-4faa-a891-bae0a7d04977" path="/var/lib/kubelet/pods/16623a65-1bef-4faa-a891-bae0a7d04977/volumes"
Mar 09 13:46:07 crc kubenswrapper[4764]: I0309 13:46:07.498776 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m5wvh"
Mar 09 13:46:07 crc kubenswrapper[4764]: I0309 13:46:07.571430 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m5wvh"
Mar 09 13:46:07 crc kubenswrapper[4764]: I0309 13:46:07.753445 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m5wvh"]
Mar 09 13:46:09 crc kubenswrapper[4764]: I0309 13:46:09.523349 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m5wvh" podUID="17216f70-2204-498b-9a97-97d6ce40bd8d" containerName="registry-server" containerID="cri-o://fcd44c7ba1ba4d3dd3602f774658ca367a62c8198a3597b77cff64b7b33a5be7" gracePeriod=2
Mar 09 13:46:09 crc kubenswrapper[4764]: I0309 13:46:09.992585 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m5wvh"
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.128831 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17216f70-2204-498b-9a97-97d6ce40bd8d-catalog-content\") pod \"17216f70-2204-498b-9a97-97d6ce40bd8d\" (UID: \"17216f70-2204-498b-9a97-97d6ce40bd8d\") "
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.128931 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhw7v\" (UniqueName: \"kubernetes.io/projected/17216f70-2204-498b-9a97-97d6ce40bd8d-kube-api-access-fhw7v\") pod \"17216f70-2204-498b-9a97-97d6ce40bd8d\" (UID: \"17216f70-2204-498b-9a97-97d6ce40bd8d\") "
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.129266 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17216f70-2204-498b-9a97-97d6ce40bd8d-utilities\") pod \"17216f70-2204-498b-9a97-97d6ce40bd8d\" (UID: \"17216f70-2204-498b-9a97-97d6ce40bd8d\") "
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.129943 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17216f70-2204-498b-9a97-97d6ce40bd8d-utilities" (OuterVolumeSpecName: "utilities") pod "17216f70-2204-498b-9a97-97d6ce40bd8d" (UID: "17216f70-2204-498b-9a97-97d6ce40bd8d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.135499 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17216f70-2204-498b-9a97-97d6ce40bd8d-kube-api-access-fhw7v" (OuterVolumeSpecName: "kube-api-access-fhw7v") pod "17216f70-2204-498b-9a97-97d6ce40bd8d" (UID: "17216f70-2204-498b-9a97-97d6ce40bd8d"). InnerVolumeSpecName "kube-api-access-fhw7v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.231593 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17216f70-2204-498b-9a97-97d6ce40bd8d-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.231666 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhw7v\" (UniqueName: \"kubernetes.io/projected/17216f70-2204-498b-9a97-97d6ce40bd8d-kube-api-access-fhw7v\") on node \"crc\" DevicePath \"\""
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.272898 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17216f70-2204-498b-9a97-97d6ce40bd8d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17216f70-2204-498b-9a97-97d6ce40bd8d" (UID: "17216f70-2204-498b-9a97-97d6ce40bd8d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.334232 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17216f70-2204-498b-9a97-97d6ce40bd8d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.539515 4764 generic.go:334] "Generic (PLEG): container finished" podID="17216f70-2204-498b-9a97-97d6ce40bd8d" containerID="fcd44c7ba1ba4d3dd3602f774658ca367a62c8198a3597b77cff64b7b33a5be7" exitCode=0
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.539580 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5wvh" event={"ID":"17216f70-2204-498b-9a97-97d6ce40bd8d","Type":"ContainerDied","Data":"fcd44c7ba1ba4d3dd3602f774658ca367a62c8198a3597b77cff64b7b33a5be7"}
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.539611 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m5wvh"
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.539695 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5wvh" event={"ID":"17216f70-2204-498b-9a97-97d6ce40bd8d","Type":"ContainerDied","Data":"6365ac478b93353f9f38b36c4bfb228fb56e0098a8f6f51ebddbad0e5763fd55"}
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.539724 4764 scope.go:117] "RemoveContainer" containerID="fcd44c7ba1ba4d3dd3602f774658ca367a62c8198a3597b77cff64b7b33a5be7"
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.566829 4764 scope.go:117] "RemoveContainer" containerID="1f65c2b840e0a2ef446a8cc3913e0633276d45ba57e7a31e6e4b8ecb08764b0e"
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.596339 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m5wvh"]
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.603305 4764 scope.go:117] "RemoveContainer" containerID="4408ae7df69e11c3a006ccacaf85cc6b3509b54b8182706c5b060cacebfd0ac1"
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.605610 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m5wvh"]
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.644551 4764 scope.go:117] "RemoveContainer" containerID="fcd44c7ba1ba4d3dd3602f774658ca367a62c8198a3597b77cff64b7b33a5be7"
Mar 09 13:46:10 crc kubenswrapper[4764]: E0309 13:46:10.645245 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcd44c7ba1ba4d3dd3602f774658ca367a62c8198a3597b77cff64b7b33a5be7\": container with ID starting with fcd44c7ba1ba4d3dd3602f774658ca367a62c8198a3597b77cff64b7b33a5be7 not found: ID does not exist" containerID="fcd44c7ba1ba4d3dd3602f774658ca367a62c8198a3597b77cff64b7b33a5be7"
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.645289 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcd44c7ba1ba4d3dd3602f774658ca367a62c8198a3597b77cff64b7b33a5be7"} err="failed to get container status \"fcd44c7ba1ba4d3dd3602f774658ca367a62c8198a3597b77cff64b7b33a5be7\": rpc error: code = NotFound desc = could not find container \"fcd44c7ba1ba4d3dd3602f774658ca367a62c8198a3597b77cff64b7b33a5be7\": container with ID starting with fcd44c7ba1ba4d3dd3602f774658ca367a62c8198a3597b77cff64b7b33a5be7 not found: ID does not exist"
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.645319 4764 scope.go:117] "RemoveContainer" containerID="1f65c2b840e0a2ef446a8cc3913e0633276d45ba57e7a31e6e4b8ecb08764b0e"
Mar 09 13:46:10 crc kubenswrapper[4764]: E0309 13:46:10.645864 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f65c2b840e0a2ef446a8cc3913e0633276d45ba57e7a31e6e4b8ecb08764b0e\": container with ID starting with 1f65c2b840e0a2ef446a8cc3913e0633276d45ba57e7a31e6e4b8ecb08764b0e not found: ID does not exist" containerID="1f65c2b840e0a2ef446a8cc3913e0633276d45ba57e7a31e6e4b8ecb08764b0e"
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.645897 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f65c2b840e0a2ef446a8cc3913e0633276d45ba57e7a31e6e4b8ecb08764b0e"} err="failed to get container status \"1f65c2b840e0a2ef446a8cc3913e0633276d45ba57e7a31e6e4b8ecb08764b0e\": rpc error: code = NotFound desc = could not find container \"1f65c2b840e0a2ef446a8cc3913e0633276d45ba57e7a31e6e4b8ecb08764b0e\": container with ID starting with 1f65c2b840e0a2ef446a8cc3913e0633276d45ba57e7a31e6e4b8ecb08764b0e not found: ID does not exist"
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.645920 4764 scope.go:117] "RemoveContainer" containerID="4408ae7df69e11c3a006ccacaf85cc6b3509b54b8182706c5b060cacebfd0ac1"
Mar 09 13:46:10 crc kubenswrapper[4764]: E0309 13:46:10.646273 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4408ae7df69e11c3a006ccacaf85cc6b3509b54b8182706c5b060cacebfd0ac1\": container with ID starting with 4408ae7df69e11c3a006ccacaf85cc6b3509b54b8182706c5b060cacebfd0ac1 not found: ID does not exist" containerID="4408ae7df69e11c3a006ccacaf85cc6b3509b54b8182706c5b060cacebfd0ac1"
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.646314 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4408ae7df69e11c3a006ccacaf85cc6b3509b54b8182706c5b060cacebfd0ac1"} err="failed to get container status \"4408ae7df69e11c3a006ccacaf85cc6b3509b54b8182706c5b060cacebfd0ac1\": rpc error: code = NotFound desc = could not find container \"4408ae7df69e11c3a006ccacaf85cc6b3509b54b8182706c5b060cacebfd0ac1\": container with ID starting with 4408ae7df69e11c3a006ccacaf85cc6b3509b54b8182706c5b060cacebfd0ac1 not found: ID does not exist"
Mar 09 13:46:11 crc kubenswrapper[4764]: I0309 13:46:11.570612 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17216f70-2204-498b-9a97-97d6ce40bd8d" path="/var/lib/kubelet/pods/17216f70-2204-498b-9a97-97d6ce40bd8d/volumes"
Mar 09 13:46:28 crc kubenswrapper[4764]: I0309 13:46:28.370324 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 13:46:28 crc kubenswrapper[4764]: I0309 13:46:28.371023 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 13:46:58 crc kubenswrapper[4764]: I0309 13:46:58.370675 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 13:46:58 crc kubenswrapper[4764]: I0309 13:46:58.371422 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 13:47:05 crc kubenswrapper[4764]: I0309 13:47:05.256179 4764 scope.go:117] "RemoveContainer" containerID="14bfbdfc3fcd7dcb9efec055a62cdfeec5d1e0e90e453a55c55ffd01ca49ad5a"
Mar 09 13:47:05 crc kubenswrapper[4764]: I0309 13:47:05.312440 4764 scope.go:117] "RemoveContainer" containerID="e53ea6806faef5cb682bac2b7668fda78ebb3a7b30791520082d7ab2a13f5aa0"
Mar 09 13:47:05 crc kubenswrapper[4764]: I0309 13:47:05.378428 4764 scope.go:117] "RemoveContainer" containerID="e74ba0435a2089e000c8df414a7a0d71d67c7b4a00cc240811cbefae67ccd184"
Mar 09 13:47:05 crc kubenswrapper[4764]: I0309 13:47:05.405199 4764 scope.go:117] "RemoveContainer" containerID="8959c8a0509c3231f91de89d62e7a7d8e52e391a4941e4498d6d53a3afa1efea"
Mar 09 13:47:05 crc kubenswrapper[4764]: I0309 13:47:05.452898 4764 scope.go:117] "RemoveContainer" containerID="f865dbb42b23c282654ed11e8388916b4f7e6868644331082fbaed39e3cbb723"
Mar 09 13:47:28 crc kubenswrapper[4764]: I0309 13:47:28.370089 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 13:47:28 crc kubenswrapper[4764]: I0309 13:47:28.370841 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 13:47:28 crc kubenswrapper[4764]: I0309 13:47:28.370902 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xxczl"
Mar 09 13:47:28 crc kubenswrapper[4764]: I0309 13:47:28.371852 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb"} pod="openshift-machine-config-operator/machine-config-daemon-xxczl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 13:47:28 crc kubenswrapper[4764]: I0309 13:47:28.371900 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" containerID="cri-o://a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" gracePeriod=600
Mar 09 13:47:28 crc kubenswrapper[4764]: E0309 13:47:28.492425 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 13:47:29 crc kubenswrapper[4764]: I0309 13:47:29.420599 4764 generic.go:334] "Generic (PLEG): container finished" podID="6bcdd179-43c2-427c-9fac-7155c122e922" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" exitCode=0
Mar 09 13:47:29 crc kubenswrapper[4764]: I0309 13:47:29.420691 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerDied","Data":"a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb"}
Mar 09 13:47:29 crc kubenswrapper[4764]: I0309 13:47:29.421115 4764 scope.go:117] "RemoveContainer" containerID="69810f80271be58962525ba5a5a37ce68d5e172f7c253c440a5a45f3f3beab77"
Mar 09 13:47:29 crc kubenswrapper[4764]: I0309 13:47:29.421998 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb"
Mar 09 13:47:29 crc kubenswrapper[4764]: E0309 13:47:29.422325 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 13:47:43 crc kubenswrapper[4764]: I0309 13:47:43.561555 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb"
Mar 09 13:47:43 crc kubenswrapper[4764]: E0309 13:47:43.562641 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 13:47:54 crc kubenswrapper[4764]: I0309 13:47:54.932864 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nxs42"]
Mar 09 13:47:54 crc kubenswrapper[4764]: E0309 13:47:54.934128 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f277802-4cc0-41e2-90f9-a9e2ac441979" containerName="oc"
Mar 09 13:47:54 crc kubenswrapper[4764]: I0309 13:47:54.934145 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f277802-4cc0-41e2-90f9-a9e2ac441979" containerName="oc"
Mar 09 13:47:54 crc kubenswrapper[4764]: E0309 13:47:54.934166 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17216f70-2204-498b-9a97-97d6ce40bd8d" containerName="extract-utilities"
Mar 09 13:47:54 crc kubenswrapper[4764]: I0309 13:47:54.934176 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="17216f70-2204-498b-9a97-97d6ce40bd8d" containerName="extract-utilities"
Mar 09 13:47:54 crc kubenswrapper[4764]: E0309 13:47:54.934220 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17216f70-2204-498b-9a97-97d6ce40bd8d" containerName="extract-content"
Mar 09 13:47:54 crc kubenswrapper[4764]: I0309 13:47:54.934230 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="17216f70-2204-498b-9a97-97d6ce40bd8d" containerName="extract-content"
Mar 09 13:47:54 crc kubenswrapper[4764]: E0309 13:47:54.934257 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17216f70-2204-498b-9a97-97d6ce40bd8d" containerName="registry-server"
Mar 09 13:47:54 crc kubenswrapper[4764]: I0309 13:47:54.934266 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="17216f70-2204-498b-9a97-97d6ce40bd8d" containerName="registry-server"
Mar 09 13:47:54 crc kubenswrapper[4764]: I0309 13:47:54.934483 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="17216f70-2204-498b-9a97-97d6ce40bd8d" containerName="registry-server"
Mar 09 13:47:54 crc kubenswrapper[4764]: I0309 13:47:54.934514 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f277802-4cc0-41e2-90f9-a9e2ac441979" containerName="oc"
Mar 09 13:47:54 crc kubenswrapper[4764]: I0309 13:47:54.936680 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nxs42"
Mar 09 13:47:54 crc kubenswrapper[4764]: I0309 13:47:54.946955 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nxs42"]
Mar 09 13:47:55 crc kubenswrapper[4764]: I0309 13:47:55.072625 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1852f78c-18c6-481e-bf04-c3eba97b11e7-utilities\") pod \"community-operators-nxs42\" (UID: \"1852f78c-18c6-481e-bf04-c3eba97b11e7\") " pod="openshift-marketplace/community-operators-nxs42"
Mar 09 13:47:55 crc kubenswrapper[4764]: I0309 13:47:55.072811 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6gwd\" (UniqueName: \"kubernetes.io/projected/1852f78c-18c6-481e-bf04-c3eba97b11e7-kube-api-access-q6gwd\") pod \"community-operators-nxs42\" (UID: \"1852f78c-18c6-481e-bf04-c3eba97b11e7\") " pod="openshift-marketplace/community-operators-nxs42"
Mar 09 13:47:55 crc kubenswrapper[4764]: I0309 13:47:55.072839 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1852f78c-18c6-481e-bf04-c3eba97b11e7-catalog-content\") pod \"community-operators-nxs42\" (UID: \"1852f78c-18c6-481e-bf04-c3eba97b11e7\") " pod="openshift-marketplace/community-operators-nxs42"
Mar 09 13:47:55 crc kubenswrapper[4764]: I0309 13:47:55.174802 4764 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"kube-api-access-q6gwd\" (UniqueName: \"kubernetes.io/projected/1852f78c-18c6-481e-bf04-c3eba97b11e7-kube-api-access-q6gwd\") pod \"community-operators-nxs42\" (UID: \"1852f78c-18c6-481e-bf04-c3eba97b11e7\") " pod="openshift-marketplace/community-operators-nxs42" Mar 09 13:47:55 crc kubenswrapper[4764]: I0309 13:47:55.174869 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1852f78c-18c6-481e-bf04-c3eba97b11e7-catalog-content\") pod \"community-operators-nxs42\" (UID: \"1852f78c-18c6-481e-bf04-c3eba97b11e7\") " pod="openshift-marketplace/community-operators-nxs42" Mar 09 13:47:55 crc kubenswrapper[4764]: I0309 13:47:55.174986 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1852f78c-18c6-481e-bf04-c3eba97b11e7-utilities\") pod \"community-operators-nxs42\" (UID: \"1852f78c-18c6-481e-bf04-c3eba97b11e7\") " pod="openshift-marketplace/community-operators-nxs42" Mar 09 13:47:55 crc kubenswrapper[4764]: I0309 13:47:55.175566 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1852f78c-18c6-481e-bf04-c3eba97b11e7-utilities\") pod \"community-operators-nxs42\" (UID: \"1852f78c-18c6-481e-bf04-c3eba97b11e7\") " pod="openshift-marketplace/community-operators-nxs42" Mar 09 13:47:55 crc kubenswrapper[4764]: I0309 13:47:55.175585 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1852f78c-18c6-481e-bf04-c3eba97b11e7-catalog-content\") pod \"community-operators-nxs42\" (UID: \"1852f78c-18c6-481e-bf04-c3eba97b11e7\") " pod="openshift-marketplace/community-operators-nxs42" Mar 09 13:47:55 crc kubenswrapper[4764]: I0309 13:47:55.198107 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-q6gwd\" (UniqueName: \"kubernetes.io/projected/1852f78c-18c6-481e-bf04-c3eba97b11e7-kube-api-access-q6gwd\") pod \"community-operators-nxs42\" (UID: \"1852f78c-18c6-481e-bf04-c3eba97b11e7\") " pod="openshift-marketplace/community-operators-nxs42" Mar 09 13:47:55 crc kubenswrapper[4764]: I0309 13:47:55.302116 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nxs42" Mar 09 13:47:55 crc kubenswrapper[4764]: I0309 13:47:55.910426 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nxs42"] Mar 09 13:47:56 crc kubenswrapper[4764]: I0309 13:47:56.746406 4764 generic.go:334] "Generic (PLEG): container finished" podID="1852f78c-18c6-481e-bf04-c3eba97b11e7" containerID="be5da138897f4a01b72eee2c62c947a531c8838ff68ca8adea703659041c2147" exitCode=0 Mar 09 13:47:56 crc kubenswrapper[4764]: I0309 13:47:56.746488 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nxs42" event={"ID":"1852f78c-18c6-481e-bf04-c3eba97b11e7","Type":"ContainerDied","Data":"be5da138897f4a01b72eee2c62c947a531c8838ff68ca8adea703659041c2147"} Mar 09 13:47:56 crc kubenswrapper[4764]: I0309 13:47:56.746829 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nxs42" event={"ID":"1852f78c-18c6-481e-bf04-c3eba97b11e7","Type":"ContainerStarted","Data":"296edd151f10a4f45fecda1a67c35226a5a35655cc126212c50ad827e9d7aed5"} Mar 09 13:47:57 crc kubenswrapper[4764]: I0309 13:47:57.560340 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:47:57 crc kubenswrapper[4764]: E0309 13:47:57.560782 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:47:58 crc kubenswrapper[4764]: I0309 13:47:58.768180 4764 generic.go:334] "Generic (PLEG): container finished" podID="1852f78c-18c6-481e-bf04-c3eba97b11e7" containerID="379ba48f962950be95366dcbf3c94dd62e989eb1302cd5bb42af707eed9f0e85" exitCode=0 Mar 09 13:47:58 crc kubenswrapper[4764]: I0309 13:47:58.768242 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nxs42" event={"ID":"1852f78c-18c6-481e-bf04-c3eba97b11e7","Type":"ContainerDied","Data":"379ba48f962950be95366dcbf3c94dd62e989eb1302cd5bb42af707eed9f0e85"} Mar 09 13:47:59 crc kubenswrapper[4764]: I0309 13:47:59.783331 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nxs42" event={"ID":"1852f78c-18c6-481e-bf04-c3eba97b11e7","Type":"ContainerStarted","Data":"06a8e00ee599e76b9570f836b2bc6f955ebe71a9de168cbb360d725880a2a573"} Mar 09 13:47:59 crc kubenswrapper[4764]: I0309 13:47:59.807630 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nxs42" podStartSLOduration=3.321709317 podStartE2EDuration="5.807609756s" podCreationTimestamp="2026-03-09 13:47:54 +0000 UTC" firstStartedPulling="2026-03-09 13:47:56.74942312 +0000 UTC m=+1631.999595028" lastFinishedPulling="2026-03-09 13:47:59.235323559 +0000 UTC m=+1634.485495467" observedRunningTime="2026-03-09 13:47:59.805752337 +0000 UTC m=+1635.055924275" watchObservedRunningTime="2026-03-09 13:47:59.807609756 +0000 UTC m=+1635.057781674" Mar 09 13:48:00 crc kubenswrapper[4764]: I0309 13:48:00.157754 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551068-v6md5"] Mar 09 13:48:00 crc kubenswrapper[4764]: I0309 13:48:00.159488 4764 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551068-v6md5" Mar 09 13:48:00 crc kubenswrapper[4764]: I0309 13:48:00.162211 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:48:00 crc kubenswrapper[4764]: I0309 13:48:00.164125 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 13:48:00 crc kubenswrapper[4764]: I0309 13:48:00.164812 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:48:00 crc kubenswrapper[4764]: I0309 13:48:00.170371 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551068-v6md5"] Mar 09 13:48:00 crc kubenswrapper[4764]: I0309 13:48:00.294193 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7trk\" (UniqueName: \"kubernetes.io/projected/ab11b944-7857-4998-b32b-264ac7683616-kube-api-access-w7trk\") pod \"auto-csr-approver-29551068-v6md5\" (UID: \"ab11b944-7857-4998-b32b-264ac7683616\") " pod="openshift-infra/auto-csr-approver-29551068-v6md5" Mar 09 13:48:00 crc kubenswrapper[4764]: I0309 13:48:00.395922 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7trk\" (UniqueName: \"kubernetes.io/projected/ab11b944-7857-4998-b32b-264ac7683616-kube-api-access-w7trk\") pod \"auto-csr-approver-29551068-v6md5\" (UID: \"ab11b944-7857-4998-b32b-264ac7683616\") " pod="openshift-infra/auto-csr-approver-29551068-v6md5" Mar 09 13:48:00 crc kubenswrapper[4764]: I0309 13:48:00.423458 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7trk\" (UniqueName: \"kubernetes.io/projected/ab11b944-7857-4998-b32b-264ac7683616-kube-api-access-w7trk\") pod \"auto-csr-approver-29551068-v6md5\" (UID: 
\"ab11b944-7857-4998-b32b-264ac7683616\") " pod="openshift-infra/auto-csr-approver-29551068-v6md5" Mar 09 13:48:00 crc kubenswrapper[4764]: I0309 13:48:00.484222 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551068-v6md5" Mar 09 13:48:01 crc kubenswrapper[4764]: I0309 13:48:00.993800 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551068-v6md5"] Mar 09 13:48:01 crc kubenswrapper[4764]: I0309 13:48:01.812061 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551068-v6md5" event={"ID":"ab11b944-7857-4998-b32b-264ac7683616","Type":"ContainerStarted","Data":"f126d2e1b174f599134d0ceb8f1afe3dc279a52c5b8d7026721c430819068c70"} Mar 09 13:48:02 crc kubenswrapper[4764]: I0309 13:48:02.824919 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551068-v6md5" event={"ID":"ab11b944-7857-4998-b32b-264ac7683616","Type":"ContainerStarted","Data":"2a7ab8e616981504d9d31e4fc4313f083401f97f7e60e7b3cdc2825dfc09335b"} Mar 09 13:48:02 crc kubenswrapper[4764]: I0309 13:48:02.851122 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551068-v6md5" podStartSLOduration=1.541986385 podStartE2EDuration="2.851096952s" podCreationTimestamp="2026-03-09 13:48:00 +0000 UTC" firstStartedPulling="2026-03-09 13:48:01.00170779 +0000 UTC m=+1636.251879698" lastFinishedPulling="2026-03-09 13:48:02.310818357 +0000 UTC m=+1637.560990265" observedRunningTime="2026-03-09 13:48:02.841428774 +0000 UTC m=+1638.091600702" watchObservedRunningTime="2026-03-09 13:48:02.851096952 +0000 UTC m=+1638.101268860" Mar 09 13:48:03 crc kubenswrapper[4764]: I0309 13:48:03.840686 4764 generic.go:334] "Generic (PLEG): container finished" podID="ab11b944-7857-4998-b32b-264ac7683616" containerID="2a7ab8e616981504d9d31e4fc4313f083401f97f7e60e7b3cdc2825dfc09335b" 
exitCode=0 Mar 09 13:48:03 crc kubenswrapper[4764]: I0309 13:48:03.840821 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551068-v6md5" event={"ID":"ab11b944-7857-4998-b32b-264ac7683616","Type":"ContainerDied","Data":"2a7ab8e616981504d9d31e4fc4313f083401f97f7e60e7b3cdc2825dfc09335b"} Mar 09 13:48:05 crc kubenswrapper[4764]: I0309 13:48:05.293996 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551068-v6md5" Mar 09 13:48:05 crc kubenswrapper[4764]: I0309 13:48:05.302989 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nxs42" Mar 09 13:48:05 crc kubenswrapper[4764]: I0309 13:48:05.303089 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nxs42" Mar 09 13:48:05 crc kubenswrapper[4764]: I0309 13:48:05.367368 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nxs42" Mar 09 13:48:05 crc kubenswrapper[4764]: I0309 13:48:05.450858 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7trk\" (UniqueName: \"kubernetes.io/projected/ab11b944-7857-4998-b32b-264ac7683616-kube-api-access-w7trk\") pod \"ab11b944-7857-4998-b32b-264ac7683616\" (UID: \"ab11b944-7857-4998-b32b-264ac7683616\") " Mar 09 13:48:05 crc kubenswrapper[4764]: I0309 13:48:05.459088 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab11b944-7857-4998-b32b-264ac7683616-kube-api-access-w7trk" (OuterVolumeSpecName: "kube-api-access-w7trk") pod "ab11b944-7857-4998-b32b-264ac7683616" (UID: "ab11b944-7857-4998-b32b-264ac7683616"). InnerVolumeSpecName "kube-api-access-w7trk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:48:05 crc kubenswrapper[4764]: I0309 13:48:05.554427 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7trk\" (UniqueName: \"kubernetes.io/projected/ab11b944-7857-4998-b32b-264ac7683616-kube-api-access-w7trk\") on node \"crc\" DevicePath \"\"" Mar 09 13:48:05 crc kubenswrapper[4764]: I0309 13:48:05.656472 4764 scope.go:117] "RemoveContainer" containerID="8419a9151f23a452d93428da7d46b2dd5d9147864269035236c144959618c6a4" Mar 09 13:48:05 crc kubenswrapper[4764]: I0309 13:48:05.709567 4764 scope.go:117] "RemoveContainer" containerID="20dab72734e5903795d478bac44970ac57f0690f8f454dee9dae223fbd3e92ac" Mar 09 13:48:05 crc kubenswrapper[4764]: I0309 13:48:05.872801 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551068-v6md5" event={"ID":"ab11b944-7857-4998-b32b-264ac7683616","Type":"ContainerDied","Data":"f126d2e1b174f599134d0ceb8f1afe3dc279a52c5b8d7026721c430819068c70"} Mar 09 13:48:05 crc kubenswrapper[4764]: I0309 13:48:05.872862 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551068-v6md5" Mar 09 13:48:05 crc kubenswrapper[4764]: I0309 13:48:05.872880 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f126d2e1b174f599134d0ceb8f1afe3dc279a52c5b8d7026721c430819068c70" Mar 09 13:48:05 crc kubenswrapper[4764]: I0309 13:48:05.930269 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551062-wl7gf"] Mar 09 13:48:05 crc kubenswrapper[4764]: I0309 13:48:05.932995 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nxs42" Mar 09 13:48:05 crc kubenswrapper[4764]: I0309 13:48:05.939271 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551062-wl7gf"] Mar 09 13:48:05 crc kubenswrapper[4764]: I0309 13:48:05.994187 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nxs42"] Mar 09 13:48:07 crc kubenswrapper[4764]: I0309 13:48:07.572666 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61f41fcc-bb74-4c90-a6af-bfcd168ef2cb" path="/var/lib/kubelet/pods/61f41fcc-bb74-4c90-a6af-bfcd168ef2cb/volumes" Mar 09 13:48:07 crc kubenswrapper[4764]: I0309 13:48:07.893532 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nxs42" podUID="1852f78c-18c6-481e-bf04-c3eba97b11e7" containerName="registry-server" containerID="cri-o://06a8e00ee599e76b9570f836b2bc6f955ebe71a9de168cbb360d725880a2a573" gracePeriod=2 Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.361882 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nxs42" Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.517306 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1852f78c-18c6-481e-bf04-c3eba97b11e7-utilities\") pod \"1852f78c-18c6-481e-bf04-c3eba97b11e7\" (UID: \"1852f78c-18c6-481e-bf04-c3eba97b11e7\") " Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.517496 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6gwd\" (UniqueName: \"kubernetes.io/projected/1852f78c-18c6-481e-bf04-c3eba97b11e7-kube-api-access-q6gwd\") pod \"1852f78c-18c6-481e-bf04-c3eba97b11e7\" (UID: \"1852f78c-18c6-481e-bf04-c3eba97b11e7\") " Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.517610 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1852f78c-18c6-481e-bf04-c3eba97b11e7-catalog-content\") pod \"1852f78c-18c6-481e-bf04-c3eba97b11e7\" (UID: \"1852f78c-18c6-481e-bf04-c3eba97b11e7\") " Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.521156 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1852f78c-18c6-481e-bf04-c3eba97b11e7-utilities" (OuterVolumeSpecName: "utilities") pod "1852f78c-18c6-481e-bf04-c3eba97b11e7" (UID: "1852f78c-18c6-481e-bf04-c3eba97b11e7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.532021 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1852f78c-18c6-481e-bf04-c3eba97b11e7-kube-api-access-q6gwd" (OuterVolumeSpecName: "kube-api-access-q6gwd") pod "1852f78c-18c6-481e-bf04-c3eba97b11e7" (UID: "1852f78c-18c6-481e-bf04-c3eba97b11e7"). InnerVolumeSpecName "kube-api-access-q6gwd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.604839 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1852f78c-18c6-481e-bf04-c3eba97b11e7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1852f78c-18c6-481e-bf04-c3eba97b11e7" (UID: "1852f78c-18c6-481e-bf04-c3eba97b11e7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.621078 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6gwd\" (UniqueName: \"kubernetes.io/projected/1852f78c-18c6-481e-bf04-c3eba97b11e7-kube-api-access-q6gwd\") on node \"crc\" DevicePath \"\"" Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.621116 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1852f78c-18c6-481e-bf04-c3eba97b11e7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.621127 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1852f78c-18c6-481e-bf04-c3eba97b11e7-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.907304 4764 generic.go:334] "Generic (PLEG): container finished" podID="1852f78c-18c6-481e-bf04-c3eba97b11e7" containerID="06a8e00ee599e76b9570f836b2bc6f955ebe71a9de168cbb360d725880a2a573" exitCode=0 Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.907842 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nxs42" event={"ID":"1852f78c-18c6-481e-bf04-c3eba97b11e7","Type":"ContainerDied","Data":"06a8e00ee599e76b9570f836b2bc6f955ebe71a9de168cbb360d725880a2a573"} Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.907890 4764 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-nxs42" event={"ID":"1852f78c-18c6-481e-bf04-c3eba97b11e7","Type":"ContainerDied","Data":"296edd151f10a4f45fecda1a67c35226a5a35655cc126212c50ad827e9d7aed5"} Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.907920 4764 scope.go:117] "RemoveContainer" containerID="06a8e00ee599e76b9570f836b2bc6f955ebe71a9de168cbb360d725880a2a573" Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.908157 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nxs42" Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.967104 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nxs42"] Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.967965 4764 scope.go:117] "RemoveContainer" containerID="379ba48f962950be95366dcbf3c94dd62e989eb1302cd5bb42af707eed9f0e85" Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.977269 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nxs42"] Mar 09 13:48:09 crc kubenswrapper[4764]: I0309 13:48:09.008429 4764 scope.go:117] "RemoveContainer" containerID="be5da138897f4a01b72eee2c62c947a531c8838ff68ca8adea703659041c2147" Mar 09 13:48:09 crc kubenswrapper[4764]: I0309 13:48:09.041266 4764 scope.go:117] "RemoveContainer" containerID="06a8e00ee599e76b9570f836b2bc6f955ebe71a9de168cbb360d725880a2a573" Mar 09 13:48:09 crc kubenswrapper[4764]: E0309 13:48:09.041779 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06a8e00ee599e76b9570f836b2bc6f955ebe71a9de168cbb360d725880a2a573\": container with ID starting with 06a8e00ee599e76b9570f836b2bc6f955ebe71a9de168cbb360d725880a2a573 not found: ID does not exist" containerID="06a8e00ee599e76b9570f836b2bc6f955ebe71a9de168cbb360d725880a2a573" Mar 09 13:48:09 crc kubenswrapper[4764]: I0309 
13:48:09.041830 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06a8e00ee599e76b9570f836b2bc6f955ebe71a9de168cbb360d725880a2a573"} err="failed to get container status \"06a8e00ee599e76b9570f836b2bc6f955ebe71a9de168cbb360d725880a2a573\": rpc error: code = NotFound desc = could not find container \"06a8e00ee599e76b9570f836b2bc6f955ebe71a9de168cbb360d725880a2a573\": container with ID starting with 06a8e00ee599e76b9570f836b2bc6f955ebe71a9de168cbb360d725880a2a573 not found: ID does not exist" Mar 09 13:48:09 crc kubenswrapper[4764]: I0309 13:48:09.041864 4764 scope.go:117] "RemoveContainer" containerID="379ba48f962950be95366dcbf3c94dd62e989eb1302cd5bb42af707eed9f0e85" Mar 09 13:48:09 crc kubenswrapper[4764]: E0309 13:48:09.042133 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"379ba48f962950be95366dcbf3c94dd62e989eb1302cd5bb42af707eed9f0e85\": container with ID starting with 379ba48f962950be95366dcbf3c94dd62e989eb1302cd5bb42af707eed9f0e85 not found: ID does not exist" containerID="379ba48f962950be95366dcbf3c94dd62e989eb1302cd5bb42af707eed9f0e85" Mar 09 13:48:09 crc kubenswrapper[4764]: I0309 13:48:09.042168 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"379ba48f962950be95366dcbf3c94dd62e989eb1302cd5bb42af707eed9f0e85"} err="failed to get container status \"379ba48f962950be95366dcbf3c94dd62e989eb1302cd5bb42af707eed9f0e85\": rpc error: code = NotFound desc = could not find container \"379ba48f962950be95366dcbf3c94dd62e989eb1302cd5bb42af707eed9f0e85\": container with ID starting with 379ba48f962950be95366dcbf3c94dd62e989eb1302cd5bb42af707eed9f0e85 not found: ID does not exist" Mar 09 13:48:09 crc kubenswrapper[4764]: I0309 13:48:09.042190 4764 scope.go:117] "RemoveContainer" containerID="be5da138897f4a01b72eee2c62c947a531c8838ff68ca8adea703659041c2147" Mar 09 13:48:09 crc 
kubenswrapper[4764]: E0309 13:48:09.042410 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be5da138897f4a01b72eee2c62c947a531c8838ff68ca8adea703659041c2147\": container with ID starting with be5da138897f4a01b72eee2c62c947a531c8838ff68ca8adea703659041c2147 not found: ID does not exist" containerID="be5da138897f4a01b72eee2c62c947a531c8838ff68ca8adea703659041c2147" Mar 09 13:48:09 crc kubenswrapper[4764]: I0309 13:48:09.042442 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be5da138897f4a01b72eee2c62c947a531c8838ff68ca8adea703659041c2147"} err="failed to get container status \"be5da138897f4a01b72eee2c62c947a531c8838ff68ca8adea703659041c2147\": rpc error: code = NotFound desc = could not find container \"be5da138897f4a01b72eee2c62c947a531c8838ff68ca8adea703659041c2147\": container with ID starting with be5da138897f4a01b72eee2c62c947a531c8838ff68ca8adea703659041c2147 not found: ID does not exist" Mar 09 13:48:09 crc kubenswrapper[4764]: I0309 13:48:09.600015 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1852f78c-18c6-481e-bf04-c3eba97b11e7" path="/var/lib/kubelet/pods/1852f78c-18c6-481e-bf04-c3eba97b11e7/volumes" Mar 09 13:48:10 crc kubenswrapper[4764]: I0309 13:48:10.560335 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:48:10 crc kubenswrapper[4764]: E0309 13:48:10.561260 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:48:21 crc 
kubenswrapper[4764]: I0309 13:48:21.481449 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5j7k8"] Mar 09 13:48:21 crc kubenswrapper[4764]: E0309 13:48:21.482969 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab11b944-7857-4998-b32b-264ac7683616" containerName="oc" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.482993 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab11b944-7857-4998-b32b-264ac7683616" containerName="oc" Mar 09 13:48:21 crc kubenswrapper[4764]: E0309 13:48:21.483010 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1852f78c-18c6-481e-bf04-c3eba97b11e7" containerName="registry-server" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.483018 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1852f78c-18c6-481e-bf04-c3eba97b11e7" containerName="registry-server" Mar 09 13:48:21 crc kubenswrapper[4764]: E0309 13:48:21.483040 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1852f78c-18c6-481e-bf04-c3eba97b11e7" containerName="extract-content" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.483048 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1852f78c-18c6-481e-bf04-c3eba97b11e7" containerName="extract-content" Mar 09 13:48:21 crc kubenswrapper[4764]: E0309 13:48:21.483086 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1852f78c-18c6-481e-bf04-c3eba97b11e7" containerName="extract-utilities" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.483095 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1852f78c-18c6-481e-bf04-c3eba97b11e7" containerName="extract-utilities" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.483332 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1852f78c-18c6-481e-bf04-c3eba97b11e7" containerName="registry-server" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.483356 4764 
memory_manager.go:354] "RemoveStaleState removing state" podUID="ab11b944-7857-4998-b32b-264ac7683616" containerName="oc" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.485210 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5j7k8" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.500695 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5j7k8"] Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.543779 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqhfj\" (UniqueName: \"kubernetes.io/projected/4752ae6f-41a1-4958-a438-d02f33f433b9-kube-api-access-dqhfj\") pod \"redhat-marketplace-5j7k8\" (UID: \"4752ae6f-41a1-4958-a438-d02f33f433b9\") " pod="openshift-marketplace/redhat-marketplace-5j7k8" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.543970 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4752ae6f-41a1-4958-a438-d02f33f433b9-utilities\") pod \"redhat-marketplace-5j7k8\" (UID: \"4752ae6f-41a1-4958-a438-d02f33f433b9\") " pod="openshift-marketplace/redhat-marketplace-5j7k8" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.544025 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4752ae6f-41a1-4958-a438-d02f33f433b9-catalog-content\") pod \"redhat-marketplace-5j7k8\" (UID: \"4752ae6f-41a1-4958-a438-d02f33f433b9\") " pod="openshift-marketplace/redhat-marketplace-5j7k8" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.560075 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:48:21 crc kubenswrapper[4764]: E0309 13:48:21.560413 4764 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.651350 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4752ae6f-41a1-4958-a438-d02f33f433b9-utilities\") pod \"redhat-marketplace-5j7k8\" (UID: \"4752ae6f-41a1-4958-a438-d02f33f433b9\") " pod="openshift-marketplace/redhat-marketplace-5j7k8" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.651450 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4752ae6f-41a1-4958-a438-d02f33f433b9-catalog-content\") pod \"redhat-marketplace-5j7k8\" (UID: \"4752ae6f-41a1-4958-a438-d02f33f433b9\") " pod="openshift-marketplace/redhat-marketplace-5j7k8" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.651532 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqhfj\" (UniqueName: \"kubernetes.io/projected/4752ae6f-41a1-4958-a438-d02f33f433b9-kube-api-access-dqhfj\") pod \"redhat-marketplace-5j7k8\" (UID: \"4752ae6f-41a1-4958-a438-d02f33f433b9\") " pod="openshift-marketplace/redhat-marketplace-5j7k8" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.652711 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4752ae6f-41a1-4958-a438-d02f33f433b9-catalog-content\") pod \"redhat-marketplace-5j7k8\" (UID: \"4752ae6f-41a1-4958-a438-d02f33f433b9\") " pod="openshift-marketplace/redhat-marketplace-5j7k8" Mar 09 13:48:21 crc 
kubenswrapper[4764]: I0309 13:48:21.652706 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4752ae6f-41a1-4958-a438-d02f33f433b9-utilities\") pod \"redhat-marketplace-5j7k8\" (UID: \"4752ae6f-41a1-4958-a438-d02f33f433b9\") " pod="openshift-marketplace/redhat-marketplace-5j7k8" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.675178 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqhfj\" (UniqueName: \"kubernetes.io/projected/4752ae6f-41a1-4958-a438-d02f33f433b9-kube-api-access-dqhfj\") pod \"redhat-marketplace-5j7k8\" (UID: \"4752ae6f-41a1-4958-a438-d02f33f433b9\") " pod="openshift-marketplace/redhat-marketplace-5j7k8" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.814840 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5j7k8" Mar 09 13:48:22 crc kubenswrapper[4764]: I0309 13:48:22.315635 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5j7k8"] Mar 09 13:48:23 crc kubenswrapper[4764]: I0309 13:48:23.057222 4764 generic.go:334] "Generic (PLEG): container finished" podID="4752ae6f-41a1-4958-a438-d02f33f433b9" containerID="e26198f59e1d2cba30bef93d2fc072b025f9c4c68243d49420c21a271867be6c" exitCode=0 Mar 09 13:48:23 crc kubenswrapper[4764]: I0309 13:48:23.059108 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5j7k8" event={"ID":"4752ae6f-41a1-4958-a438-d02f33f433b9","Type":"ContainerDied","Data":"e26198f59e1d2cba30bef93d2fc072b025f9c4c68243d49420c21a271867be6c"} Mar 09 13:48:23 crc kubenswrapper[4764]: I0309 13:48:23.059228 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5j7k8" 
event={"ID":"4752ae6f-41a1-4958-a438-d02f33f433b9","Type":"ContainerStarted","Data":"cdfbd1bebce105b4241f3b0f024b996f4c76295bcaa2064aa1ca4accccba2294"} Mar 09 13:48:24 crc kubenswrapper[4764]: I0309 13:48:24.072441 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5j7k8" event={"ID":"4752ae6f-41a1-4958-a438-d02f33f433b9","Type":"ContainerStarted","Data":"60cd6fbe18efa3c4fddc9f30d3caa7b50e639752b292b13cb630a9f474a0231a"} Mar 09 13:48:25 crc kubenswrapper[4764]: I0309 13:48:25.084386 4764 generic.go:334] "Generic (PLEG): container finished" podID="4752ae6f-41a1-4958-a438-d02f33f433b9" containerID="60cd6fbe18efa3c4fddc9f30d3caa7b50e639752b292b13cb630a9f474a0231a" exitCode=0 Mar 09 13:48:25 crc kubenswrapper[4764]: I0309 13:48:25.084448 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5j7k8" event={"ID":"4752ae6f-41a1-4958-a438-d02f33f433b9","Type":"ContainerDied","Data":"60cd6fbe18efa3c4fddc9f30d3caa7b50e639752b292b13cb630a9f474a0231a"} Mar 09 13:48:26 crc kubenswrapper[4764]: I0309 13:48:26.097083 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5j7k8" event={"ID":"4752ae6f-41a1-4958-a438-d02f33f433b9","Type":"ContainerStarted","Data":"2e3c4f2311885cf47d183a10bee85def44f67398c81277a86e814ec507b51816"} Mar 09 13:48:26 crc kubenswrapper[4764]: I0309 13:48:26.125009 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5j7k8" podStartSLOduration=2.49528295 podStartE2EDuration="5.124980191s" podCreationTimestamp="2026-03-09 13:48:21 +0000 UTC" firstStartedPulling="2026-03-09 13:48:23.062090989 +0000 UTC m=+1658.312262897" lastFinishedPulling="2026-03-09 13:48:25.69178823 +0000 UTC m=+1660.941960138" observedRunningTime="2026-03-09 13:48:26.121449027 +0000 UTC m=+1661.371620955" watchObservedRunningTime="2026-03-09 13:48:26.124980191 +0000 UTC 
m=+1661.375152099" Mar 09 13:48:31 crc kubenswrapper[4764]: I0309 13:48:31.815281 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5j7k8" Mar 09 13:48:31 crc kubenswrapper[4764]: I0309 13:48:31.816173 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5j7k8" Mar 09 13:48:31 crc kubenswrapper[4764]: I0309 13:48:31.872159 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5j7k8" Mar 09 13:48:32 crc kubenswrapper[4764]: I0309 13:48:32.216145 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5j7k8" Mar 09 13:48:32 crc kubenswrapper[4764]: I0309 13:48:32.753303 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5j7k8"] Mar 09 13:48:34 crc kubenswrapper[4764]: I0309 13:48:34.189232 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5j7k8" podUID="4752ae6f-41a1-4958-a438-d02f33f433b9" containerName="registry-server" containerID="cri-o://2e3c4f2311885cf47d183a10bee85def44f67398c81277a86e814ec507b51816" gracePeriod=2 Mar 09 13:48:34 crc kubenswrapper[4764]: I0309 13:48:34.676400 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5j7k8" Mar 09 13:48:34 crc kubenswrapper[4764]: I0309 13:48:34.775569 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4752ae6f-41a1-4958-a438-d02f33f433b9-catalog-content\") pod \"4752ae6f-41a1-4958-a438-d02f33f433b9\" (UID: \"4752ae6f-41a1-4958-a438-d02f33f433b9\") " Mar 09 13:48:34 crc kubenswrapper[4764]: I0309 13:48:34.775698 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4752ae6f-41a1-4958-a438-d02f33f433b9-utilities\") pod \"4752ae6f-41a1-4958-a438-d02f33f433b9\" (UID: \"4752ae6f-41a1-4958-a438-d02f33f433b9\") " Mar 09 13:48:34 crc kubenswrapper[4764]: I0309 13:48:34.775895 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqhfj\" (UniqueName: \"kubernetes.io/projected/4752ae6f-41a1-4958-a438-d02f33f433b9-kube-api-access-dqhfj\") pod \"4752ae6f-41a1-4958-a438-d02f33f433b9\" (UID: \"4752ae6f-41a1-4958-a438-d02f33f433b9\") " Mar 09 13:48:34 crc kubenswrapper[4764]: I0309 13:48:34.776815 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4752ae6f-41a1-4958-a438-d02f33f433b9-utilities" (OuterVolumeSpecName: "utilities") pod "4752ae6f-41a1-4958-a438-d02f33f433b9" (UID: "4752ae6f-41a1-4958-a438-d02f33f433b9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:48:34 crc kubenswrapper[4764]: I0309 13:48:34.784595 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4752ae6f-41a1-4958-a438-d02f33f433b9-kube-api-access-dqhfj" (OuterVolumeSpecName: "kube-api-access-dqhfj") pod "4752ae6f-41a1-4958-a438-d02f33f433b9" (UID: "4752ae6f-41a1-4958-a438-d02f33f433b9"). InnerVolumeSpecName "kube-api-access-dqhfj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:48:34 crc kubenswrapper[4764]: I0309 13:48:34.813247 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4752ae6f-41a1-4958-a438-d02f33f433b9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4752ae6f-41a1-4958-a438-d02f33f433b9" (UID: "4752ae6f-41a1-4958-a438-d02f33f433b9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:48:34 crc kubenswrapper[4764]: I0309 13:48:34.879210 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4752ae6f-41a1-4958-a438-d02f33f433b9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:48:34 crc kubenswrapper[4764]: I0309 13:48:34.879265 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4752ae6f-41a1-4958-a438-d02f33f433b9-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:48:34 crc kubenswrapper[4764]: I0309 13:48:34.879342 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqhfj\" (UniqueName: \"kubernetes.io/projected/4752ae6f-41a1-4958-a438-d02f33f433b9-kube-api-access-dqhfj\") on node \"crc\" DevicePath \"\"" Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.206360 4764 generic.go:334] "Generic (PLEG): container finished" podID="4752ae6f-41a1-4958-a438-d02f33f433b9" containerID="2e3c4f2311885cf47d183a10bee85def44f67398c81277a86e814ec507b51816" exitCode=0 Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.206430 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5j7k8" event={"ID":"4752ae6f-41a1-4958-a438-d02f33f433b9","Type":"ContainerDied","Data":"2e3c4f2311885cf47d183a10bee85def44f67398c81277a86e814ec507b51816"} Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.206486 4764 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-5j7k8" event={"ID":"4752ae6f-41a1-4958-a438-d02f33f433b9","Type":"ContainerDied","Data":"cdfbd1bebce105b4241f3b0f024b996f4c76295bcaa2064aa1ca4accccba2294"} Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.206505 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5j7k8" Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.206523 4764 scope.go:117] "RemoveContainer" containerID="2e3c4f2311885cf47d183a10bee85def44f67398c81277a86e814ec507b51816" Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.252993 4764 scope.go:117] "RemoveContainer" containerID="60cd6fbe18efa3c4fddc9f30d3caa7b50e639752b292b13cb630a9f474a0231a" Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.271715 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5j7k8"] Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.280423 4764 scope.go:117] "RemoveContainer" containerID="e26198f59e1d2cba30bef93d2fc072b025f9c4c68243d49420c21a271867be6c" Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.286439 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5j7k8"] Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.322960 4764 scope.go:117] "RemoveContainer" containerID="2e3c4f2311885cf47d183a10bee85def44f67398c81277a86e814ec507b51816" Mar 09 13:48:35 crc kubenswrapper[4764]: E0309 13:48:35.323533 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e3c4f2311885cf47d183a10bee85def44f67398c81277a86e814ec507b51816\": container with ID starting with 2e3c4f2311885cf47d183a10bee85def44f67398c81277a86e814ec507b51816 not found: ID does not exist" containerID="2e3c4f2311885cf47d183a10bee85def44f67398c81277a86e814ec507b51816" Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.323576 4764 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e3c4f2311885cf47d183a10bee85def44f67398c81277a86e814ec507b51816"} err="failed to get container status \"2e3c4f2311885cf47d183a10bee85def44f67398c81277a86e814ec507b51816\": rpc error: code = NotFound desc = could not find container \"2e3c4f2311885cf47d183a10bee85def44f67398c81277a86e814ec507b51816\": container with ID starting with 2e3c4f2311885cf47d183a10bee85def44f67398c81277a86e814ec507b51816 not found: ID does not exist" Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.323606 4764 scope.go:117] "RemoveContainer" containerID="60cd6fbe18efa3c4fddc9f30d3caa7b50e639752b292b13cb630a9f474a0231a" Mar 09 13:48:35 crc kubenswrapper[4764]: E0309 13:48:35.324139 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60cd6fbe18efa3c4fddc9f30d3caa7b50e639752b292b13cb630a9f474a0231a\": container with ID starting with 60cd6fbe18efa3c4fddc9f30d3caa7b50e639752b292b13cb630a9f474a0231a not found: ID does not exist" containerID="60cd6fbe18efa3c4fddc9f30d3caa7b50e639752b292b13cb630a9f474a0231a" Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.324171 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60cd6fbe18efa3c4fddc9f30d3caa7b50e639752b292b13cb630a9f474a0231a"} err="failed to get container status \"60cd6fbe18efa3c4fddc9f30d3caa7b50e639752b292b13cb630a9f474a0231a\": rpc error: code = NotFound desc = could not find container \"60cd6fbe18efa3c4fddc9f30d3caa7b50e639752b292b13cb630a9f474a0231a\": container with ID starting with 60cd6fbe18efa3c4fddc9f30d3caa7b50e639752b292b13cb630a9f474a0231a not found: ID does not exist" Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.324188 4764 scope.go:117] "RemoveContainer" containerID="e26198f59e1d2cba30bef93d2fc072b025f9c4c68243d49420c21a271867be6c" Mar 09 13:48:35 crc kubenswrapper[4764]: E0309 
13:48:35.324628 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e26198f59e1d2cba30bef93d2fc072b025f9c4c68243d49420c21a271867be6c\": container with ID starting with e26198f59e1d2cba30bef93d2fc072b025f9c4c68243d49420c21a271867be6c not found: ID does not exist" containerID="e26198f59e1d2cba30bef93d2fc072b025f9c4c68243d49420c21a271867be6c" Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.324722 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e26198f59e1d2cba30bef93d2fc072b025f9c4c68243d49420c21a271867be6c"} err="failed to get container status \"e26198f59e1d2cba30bef93d2fc072b025f9c4c68243d49420c21a271867be6c\": rpc error: code = NotFound desc = could not find container \"e26198f59e1d2cba30bef93d2fc072b025f9c4c68243d49420c21a271867be6c\": container with ID starting with e26198f59e1d2cba30bef93d2fc072b025f9c4c68243d49420c21a271867be6c not found: ID does not exist" Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.571970 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:48:35 crc kubenswrapper[4764]: E0309 13:48:35.572467 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.579541 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4752ae6f-41a1-4958-a438-d02f33f433b9" path="/var/lib/kubelet/pods/4752ae6f-41a1-4958-a438-d02f33f433b9/volumes" Mar 09 13:48:36 crc kubenswrapper[4764]: I0309 13:48:36.218817 
4764 generic.go:334] "Generic (PLEG): container finished" podID="fe0d9990-083b-428b-baec-a40ae99487db" containerID="6e4f8f0feb8ec9ecf660634549716b0c350cba173b98c033e4c7e09aa6bd108d" exitCode=0 Mar 09 13:48:36 crc kubenswrapper[4764]: I0309 13:48:36.218893 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr" event={"ID":"fe0d9990-083b-428b-baec-a40ae99487db","Type":"ContainerDied","Data":"6e4f8f0feb8ec9ecf660634549716b0c350cba173b98c033e4c7e09aa6bd108d"} Mar 09 13:48:37 crc kubenswrapper[4764]: I0309 13:48:37.685200 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr" Mar 09 13:48:37 crc kubenswrapper[4764]: I0309 13:48:37.846149 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-bootstrap-combined-ca-bundle\") pod \"fe0d9990-083b-428b-baec-a40ae99487db\" (UID: \"fe0d9990-083b-428b-baec-a40ae99487db\") " Mar 09 13:48:37 crc kubenswrapper[4764]: I0309 13:48:37.846206 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrpw2\" (UniqueName: \"kubernetes.io/projected/fe0d9990-083b-428b-baec-a40ae99487db-kube-api-access-rrpw2\") pod \"fe0d9990-083b-428b-baec-a40ae99487db\" (UID: \"fe0d9990-083b-428b-baec-a40ae99487db\") " Mar 09 13:48:37 crc kubenswrapper[4764]: I0309 13:48:37.846289 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-ssh-key-openstack-edpm-ipam\") pod \"fe0d9990-083b-428b-baec-a40ae99487db\" (UID: \"fe0d9990-083b-428b-baec-a40ae99487db\") " Mar 09 13:48:37 crc kubenswrapper[4764]: I0309 13:48:37.846557 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-inventory\") pod \"fe0d9990-083b-428b-baec-a40ae99487db\" (UID: \"fe0d9990-083b-428b-baec-a40ae99487db\") " Mar 09 13:48:37 crc kubenswrapper[4764]: I0309 13:48:37.854580 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "fe0d9990-083b-428b-baec-a40ae99487db" (UID: "fe0d9990-083b-428b-baec-a40ae99487db"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:48:37 crc kubenswrapper[4764]: I0309 13:48:37.855094 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe0d9990-083b-428b-baec-a40ae99487db-kube-api-access-rrpw2" (OuterVolumeSpecName: "kube-api-access-rrpw2") pod "fe0d9990-083b-428b-baec-a40ae99487db" (UID: "fe0d9990-083b-428b-baec-a40ae99487db"). InnerVolumeSpecName "kube-api-access-rrpw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:48:37 crc kubenswrapper[4764]: I0309 13:48:37.877541 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-inventory" (OuterVolumeSpecName: "inventory") pod "fe0d9990-083b-428b-baec-a40ae99487db" (UID: "fe0d9990-083b-428b-baec-a40ae99487db"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:48:37 crc kubenswrapper[4764]: I0309 13:48:37.884100 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fe0d9990-083b-428b-baec-a40ae99487db" (UID: "fe0d9990-083b-428b-baec-a40ae99487db"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:48:37 crc kubenswrapper[4764]: I0309 13:48:37.949448 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:48:37 crc kubenswrapper[4764]: I0309 13:48:37.949509 4764 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:48:37 crc kubenswrapper[4764]: I0309 13:48:37.949525 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrpw2\" (UniqueName: \"kubernetes.io/projected/fe0d9990-083b-428b-baec-a40ae99487db-kube-api-access-rrpw2\") on node \"crc\" DevicePath \"\"" Mar 09 13:48:37 crc kubenswrapper[4764]: I0309 13:48:37.949538 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.243911 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr" event={"ID":"fe0d9990-083b-428b-baec-a40ae99487db","Type":"ContainerDied","Data":"88f1f6de4e55f98cf3fb91581e4ad15ec69d81b47c69d9187bda289cfb67da48"} Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.244461 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88f1f6de4e55f98cf3fb91581e4ad15ec69d81b47c69d9187bda289cfb67da48" Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.244266 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr" Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.341848 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q"] Mar 09 13:48:38 crc kubenswrapper[4764]: E0309 13:48:38.342471 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4752ae6f-41a1-4958-a438-d02f33f433b9" containerName="registry-server" Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.342503 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4752ae6f-41a1-4958-a438-d02f33f433b9" containerName="registry-server" Mar 09 13:48:38 crc kubenswrapper[4764]: E0309 13:48:38.342529 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4752ae6f-41a1-4958-a438-d02f33f433b9" containerName="extract-content" Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.342539 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4752ae6f-41a1-4958-a438-d02f33f433b9" containerName="extract-content" Mar 09 13:48:38 crc kubenswrapper[4764]: E0309 13:48:38.342572 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe0d9990-083b-428b-baec-a40ae99487db" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.342586 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0d9990-083b-428b-baec-a40ae99487db" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 09 13:48:38 crc kubenswrapper[4764]: E0309 13:48:38.342603 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4752ae6f-41a1-4958-a438-d02f33f433b9" containerName="extract-utilities" Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.342612 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4752ae6f-41a1-4958-a438-d02f33f433b9" containerName="extract-utilities" Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.342807 
4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4752ae6f-41a1-4958-a438-d02f33f433b9" containerName="registry-server" Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.342831 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe0d9990-083b-428b-baec-a40ae99487db" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.343637 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q" Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.349364 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.349793 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.349966 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.350077 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.354415 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q"] Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.461632 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q\" (UID: \"85b3887c-ea0d-4ca0-a862-0134f0ae08b5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q" Mar 09 13:48:38 crc kubenswrapper[4764]: 
I0309 13:48:38.461821 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9cnl\" (UniqueName: \"kubernetes.io/projected/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-kube-api-access-h9cnl\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q\" (UID: \"85b3887c-ea0d-4ca0-a862-0134f0ae08b5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q" Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.461909 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q\" (UID: \"85b3887c-ea0d-4ca0-a862-0134f0ae08b5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q" Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.565860 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q\" (UID: \"85b3887c-ea0d-4ca0-a862-0134f0ae08b5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q" Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.566089 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q\" (UID: \"85b3887c-ea0d-4ca0-a862-0134f0ae08b5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q" Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.566184 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-h9cnl\" (UniqueName: \"kubernetes.io/projected/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-kube-api-access-h9cnl\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q\" (UID: \"85b3887c-ea0d-4ca0-a862-0134f0ae08b5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q" Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.571790 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q\" (UID: \"85b3887c-ea0d-4ca0-a862-0134f0ae08b5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q" Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.574905 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q\" (UID: \"85b3887c-ea0d-4ca0-a862-0134f0ae08b5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q" Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.592628 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9cnl\" (UniqueName: \"kubernetes.io/projected/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-kube-api-access-h9cnl\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q\" (UID: \"85b3887c-ea0d-4ca0-a862-0134f0ae08b5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q" Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.666180 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q" Mar 09 13:48:39 crc kubenswrapper[4764]: I0309 13:48:39.235394 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q"] Mar 09 13:48:39 crc kubenswrapper[4764]: I0309 13:48:39.260098 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q" event={"ID":"85b3887c-ea0d-4ca0-a862-0134f0ae08b5","Type":"ContainerStarted","Data":"57eae1f53cc5fcb5ec3680529144f8d305dbe72b97f96900aa48da343bdb99f9"} Mar 09 13:48:41 crc kubenswrapper[4764]: I0309 13:48:41.286912 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q" event={"ID":"85b3887c-ea0d-4ca0-a862-0134f0ae08b5","Type":"ContainerStarted","Data":"8097883da308046a256f53d8042acf3d3dddd33e51cc0f93ed512385c36b57c0"} Mar 09 13:48:41 crc kubenswrapper[4764]: I0309 13:48:41.316364 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q" podStartSLOduration=2.5255194530000002 podStartE2EDuration="3.316310071s" podCreationTimestamp="2026-03-09 13:48:38 +0000 UTC" firstStartedPulling="2026-03-09 13:48:39.24884586 +0000 UTC m=+1674.499017768" lastFinishedPulling="2026-03-09 13:48:40.039636468 +0000 UTC m=+1675.289808386" observedRunningTime="2026-03-09 13:48:41.306701245 +0000 UTC m=+1676.556873183" watchObservedRunningTime="2026-03-09 13:48:41.316310071 +0000 UTC m=+1676.566481979" Mar 09 13:48:46 crc kubenswrapper[4764]: I0309 13:48:46.560188 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:48:46 crc kubenswrapper[4764]: E0309 13:48:46.561379 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:48:59 crc kubenswrapper[4764]: I0309 13:48:59.560682 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:48:59 crc kubenswrapper[4764]: E0309 13:48:59.561849 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:49:05 crc kubenswrapper[4764]: I0309 13:49:05.786881 4764 scope.go:117] "RemoveContainer" containerID="81d1407cfee85c86ab81cdf9ec86e68979d06f5a586c4a54b9f7e6a537e80948" Mar 09 13:49:05 crc kubenswrapper[4764]: I0309 13:49:05.810595 4764 scope.go:117] "RemoveContainer" containerID="eb86255e700a4d6357cce0dc3140544f102a32bacc79bd2992bea6bc6385fe75" Mar 09 13:49:05 crc kubenswrapper[4764]: I0309 13:49:05.832542 4764 scope.go:117] "RemoveContainer" containerID="9f52cd378552f4424230b7014d94477e188c2eb48ce843f7df141a1298245bd6" Mar 09 13:49:14 crc kubenswrapper[4764]: I0309 13:49:14.560299 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:49:14 crc kubenswrapper[4764]: E0309 13:49:14.561336 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:49:25 crc kubenswrapper[4764]: I0309 13:49:25.560617 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:49:25 crc kubenswrapper[4764]: E0309 13:49:25.561728 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:49:37 crc kubenswrapper[4764]: I0309 13:49:37.560285 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:49:37 crc kubenswrapper[4764]: E0309 13:49:37.561370 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:49:47 crc kubenswrapper[4764]: I0309 13:49:47.983230 4764 generic.go:334] "Generic (PLEG): container finished" podID="85b3887c-ea0d-4ca0-a862-0134f0ae08b5" containerID="8097883da308046a256f53d8042acf3d3dddd33e51cc0f93ed512385c36b57c0" exitCode=0 Mar 09 13:49:47 crc kubenswrapper[4764]: I0309 13:49:47.983352 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q" event={"ID":"85b3887c-ea0d-4ca0-a862-0134f0ae08b5","Type":"ContainerDied","Data":"8097883da308046a256f53d8042acf3d3dddd33e51cc0f93ed512385c36b57c0"} Mar 09 13:49:49 crc kubenswrapper[4764]: I0309 13:49:49.443095 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q" Mar 09 13:49:49 crc kubenswrapper[4764]: I0309 13:49:49.567314 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9cnl\" (UniqueName: \"kubernetes.io/projected/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-kube-api-access-h9cnl\") pod \"85b3887c-ea0d-4ca0-a862-0134f0ae08b5\" (UID: \"85b3887c-ea0d-4ca0-a862-0134f0ae08b5\") " Mar 09 13:49:49 crc kubenswrapper[4764]: I0309 13:49:49.568253 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-ssh-key-openstack-edpm-ipam\") pod \"85b3887c-ea0d-4ca0-a862-0134f0ae08b5\" (UID: \"85b3887c-ea0d-4ca0-a862-0134f0ae08b5\") " Mar 09 13:49:49 crc kubenswrapper[4764]: I0309 13:49:49.568303 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-inventory\") pod \"85b3887c-ea0d-4ca0-a862-0134f0ae08b5\" (UID: \"85b3887c-ea0d-4ca0-a862-0134f0ae08b5\") " Mar 09 13:49:49 crc kubenswrapper[4764]: I0309 13:49:49.580025 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-kube-api-access-h9cnl" (OuterVolumeSpecName: "kube-api-access-h9cnl") pod "85b3887c-ea0d-4ca0-a862-0134f0ae08b5" (UID: "85b3887c-ea0d-4ca0-a862-0134f0ae08b5"). InnerVolumeSpecName "kube-api-access-h9cnl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:49:49 crc kubenswrapper[4764]: I0309 13:49:49.597965 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "85b3887c-ea0d-4ca0-a862-0134f0ae08b5" (UID: "85b3887c-ea0d-4ca0-a862-0134f0ae08b5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:49:49 crc kubenswrapper[4764]: I0309 13:49:49.605313 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-inventory" (OuterVolumeSpecName: "inventory") pod "85b3887c-ea0d-4ca0-a862-0134f0ae08b5" (UID: "85b3887c-ea0d-4ca0-a862-0134f0ae08b5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:49:49 crc kubenswrapper[4764]: I0309 13:49:49.671736 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:49:49 crc kubenswrapper[4764]: I0309 13:49:49.671803 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:49:49 crc kubenswrapper[4764]: I0309 13:49:49.671823 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9cnl\" (UniqueName: \"kubernetes.io/projected/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-kube-api-access-h9cnl\") on node \"crc\" DevicePath \"\"" Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.009601 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q" 
event={"ID":"85b3887c-ea0d-4ca0-a862-0134f0ae08b5","Type":"ContainerDied","Data":"57eae1f53cc5fcb5ec3680529144f8d305dbe72b97f96900aa48da343bdb99f9"} Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.009682 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57eae1f53cc5fcb5ec3680529144f8d305dbe72b97f96900aa48da343bdb99f9" Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.010021 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q" Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.142926 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2"] Mar 09 13:49:50 crc kubenswrapper[4764]: E0309 13:49:50.143523 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85b3887c-ea0d-4ca0-a862-0134f0ae08b5" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.143550 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="85b3887c-ea0d-4ca0-a862-0134f0ae08b5" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.149917 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="85b3887c-ea0d-4ca0-a862-0134f0ae08b5" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.150925 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2" Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.153891 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.154385 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.162283 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.162711 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.191252 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2"] Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.285970 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/38445f30-348d-4c11-94c5-81bca885cc36-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-g84h2\" (UID: \"38445f30-348d-4c11-94c5-81bca885cc36\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2" Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.286086 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38445f30-348d-4c11-94c5-81bca885cc36-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-g84h2\" (UID: \"38445f30-348d-4c11-94c5-81bca885cc36\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2" Mar 09 13:49:50 crc kubenswrapper[4764]: 
I0309 13:49:50.286126 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlf82\" (UniqueName: \"kubernetes.io/projected/38445f30-348d-4c11-94c5-81bca885cc36-kube-api-access-tlf82\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-g84h2\" (UID: \"38445f30-348d-4c11-94c5-81bca885cc36\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2" Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.388769 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/38445f30-348d-4c11-94c5-81bca885cc36-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-g84h2\" (UID: \"38445f30-348d-4c11-94c5-81bca885cc36\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2" Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.388951 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38445f30-348d-4c11-94c5-81bca885cc36-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-g84h2\" (UID: \"38445f30-348d-4c11-94c5-81bca885cc36\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2" Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.389015 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlf82\" (UniqueName: \"kubernetes.io/projected/38445f30-348d-4c11-94c5-81bca885cc36-kube-api-access-tlf82\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-g84h2\" (UID: \"38445f30-348d-4c11-94c5-81bca885cc36\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2" Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.394224 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/38445f30-348d-4c11-94c5-81bca885cc36-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-g84h2\" (UID: \"38445f30-348d-4c11-94c5-81bca885cc36\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2" Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.395549 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38445f30-348d-4c11-94c5-81bca885cc36-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-g84h2\" (UID: \"38445f30-348d-4c11-94c5-81bca885cc36\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2" Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.410317 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlf82\" (UniqueName: \"kubernetes.io/projected/38445f30-348d-4c11-94c5-81bca885cc36-kube-api-access-tlf82\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-g84h2\" (UID: \"38445f30-348d-4c11-94c5-81bca885cc36\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2" Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.481337 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2" Mar 09 13:49:51 crc kubenswrapper[4764]: I0309 13:49:51.040854 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2"] Mar 09 13:49:51 crc kubenswrapper[4764]: I0309 13:49:51.560464 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:49:51 crc kubenswrapper[4764]: E0309 13:49:51.561373 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:49:52 crc kubenswrapper[4764]: I0309 13:49:52.032294 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2" event={"ID":"38445f30-348d-4c11-94c5-81bca885cc36","Type":"ContainerStarted","Data":"f0aca49c15e8c97fb96eb53f9577a50ff9c6c25e52f97ab9fc13ad081c1cd506"} Mar 09 13:49:52 crc kubenswrapper[4764]: I0309 13:49:52.032856 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2" event={"ID":"38445f30-348d-4c11-94c5-81bca885cc36","Type":"ContainerStarted","Data":"31a768dc42ab731fb41bb76184444fdc0ca9a4645ca28028647c2c27cf959d03"} Mar 09 13:49:52 crc kubenswrapper[4764]: I0309 13:49:52.061733 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2" podStartSLOduration=1.632767631 podStartE2EDuration="2.061701388s" podCreationTimestamp="2026-03-09 13:49:50 +0000 UTC" 
firstStartedPulling="2026-03-09 13:49:51.045402218 +0000 UTC m=+1746.295574136" lastFinishedPulling="2026-03-09 13:49:51.474335965 +0000 UTC m=+1746.724507893" observedRunningTime="2026-03-09 13:49:52.04677134 +0000 UTC m=+1747.296943288" watchObservedRunningTime="2026-03-09 13:49:52.061701388 +0000 UTC m=+1747.311873336" Mar 09 13:49:57 crc kubenswrapper[4764]: I0309 13:49:57.094173 4764 generic.go:334] "Generic (PLEG): container finished" podID="38445f30-348d-4c11-94c5-81bca885cc36" containerID="f0aca49c15e8c97fb96eb53f9577a50ff9c6c25e52f97ab9fc13ad081c1cd506" exitCode=0 Mar 09 13:49:57 crc kubenswrapper[4764]: I0309 13:49:57.094290 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2" event={"ID":"38445f30-348d-4c11-94c5-81bca885cc36","Type":"ContainerDied","Data":"f0aca49c15e8c97fb96eb53f9577a50ff9c6c25e52f97ab9fc13ad081c1cd506"} Mar 09 13:49:58 crc kubenswrapper[4764]: I0309 13:49:58.527431 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2" Mar 09 13:49:58 crc kubenswrapper[4764]: I0309 13:49:58.688030 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/38445f30-348d-4c11-94c5-81bca885cc36-ssh-key-openstack-edpm-ipam\") pod \"38445f30-348d-4c11-94c5-81bca885cc36\" (UID: \"38445f30-348d-4c11-94c5-81bca885cc36\") " Mar 09 13:49:58 crc kubenswrapper[4764]: I0309 13:49:58.688585 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38445f30-348d-4c11-94c5-81bca885cc36-inventory\") pod \"38445f30-348d-4c11-94c5-81bca885cc36\" (UID: \"38445f30-348d-4c11-94c5-81bca885cc36\") " Mar 09 13:49:58 crc kubenswrapper[4764]: I0309 13:49:58.688745 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlf82\" (UniqueName: \"kubernetes.io/projected/38445f30-348d-4c11-94c5-81bca885cc36-kube-api-access-tlf82\") pod \"38445f30-348d-4c11-94c5-81bca885cc36\" (UID: \"38445f30-348d-4c11-94c5-81bca885cc36\") " Mar 09 13:49:58 crc kubenswrapper[4764]: I0309 13:49:58.704156 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38445f30-348d-4c11-94c5-81bca885cc36-kube-api-access-tlf82" (OuterVolumeSpecName: "kube-api-access-tlf82") pod "38445f30-348d-4c11-94c5-81bca885cc36" (UID: "38445f30-348d-4c11-94c5-81bca885cc36"). InnerVolumeSpecName "kube-api-access-tlf82". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:49:58 crc kubenswrapper[4764]: I0309 13:49:58.718003 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38445f30-348d-4c11-94c5-81bca885cc36-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "38445f30-348d-4c11-94c5-81bca885cc36" (UID: "38445f30-348d-4c11-94c5-81bca885cc36"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:49:58 crc kubenswrapper[4764]: I0309 13:49:58.723834 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38445f30-348d-4c11-94c5-81bca885cc36-inventory" (OuterVolumeSpecName: "inventory") pod "38445f30-348d-4c11-94c5-81bca885cc36" (UID: "38445f30-348d-4c11-94c5-81bca885cc36"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:49:58 crc kubenswrapper[4764]: I0309 13:49:58.791912 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/38445f30-348d-4c11-94c5-81bca885cc36-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:49:58 crc kubenswrapper[4764]: I0309 13:49:58.791956 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38445f30-348d-4c11-94c5-81bca885cc36-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:49:58 crc kubenswrapper[4764]: I0309 13:49:58.791965 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlf82\" (UniqueName: \"kubernetes.io/projected/38445f30-348d-4c11-94c5-81bca885cc36-kube-api-access-tlf82\") on node \"crc\" DevicePath \"\"" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.125390 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2" 
event={"ID":"38445f30-348d-4c11-94c5-81bca885cc36","Type":"ContainerDied","Data":"31a768dc42ab731fb41bb76184444fdc0ca9a4645ca28028647c2c27cf959d03"} Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.125464 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31a768dc42ab731fb41bb76184444fdc0ca9a4645ca28028647c2c27cf959d03" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.125484 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.211809 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg"] Mar 09 13:49:59 crc kubenswrapper[4764]: E0309 13:49:59.212499 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38445f30-348d-4c11-94c5-81bca885cc36" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.212526 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="38445f30-348d-4c11-94c5-81bca885cc36" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.212841 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="38445f30-348d-4c11-94c5-81bca885cc36" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.213780 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.216307 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.216736 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.217194 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.217388 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.230367 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg"] Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.305599 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j2fzg\" (UID: \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.305833 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j2fzg\" (UID: \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.305865 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7snl\" (UniqueName: \"kubernetes.io/projected/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-kube-api-access-k7snl\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j2fzg\" (UID: \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.407716 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j2fzg\" (UID: \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.407761 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7snl\" (UniqueName: \"kubernetes.io/projected/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-kube-api-access-k7snl\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j2fzg\" (UID: \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.407851 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j2fzg\" (UID: \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.414350 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j2fzg\" (UID: \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.414561 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j2fzg\" (UID: \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.428252 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7snl\" (UniqueName: \"kubernetes.io/projected/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-kube-api-access-k7snl\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j2fzg\" (UID: \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.547808 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" Mar 09 13:50:00 crc kubenswrapper[4764]: I0309 13:50:00.106971 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg"] Mar 09 13:50:00 crc kubenswrapper[4764]: I0309 13:50:00.139846 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" event={"ID":"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add","Type":"ContainerStarted","Data":"63d45c8e8448e981e4b409a90ffed5471ff40c994d9269a761fcbf3a1ba24c89"} Mar 09 13:50:00 crc kubenswrapper[4764]: I0309 13:50:00.154038 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551070-x977g"] Mar 09 13:50:00 crc kubenswrapper[4764]: I0309 13:50:00.155877 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551070-x977g" Mar 09 13:50:00 crc kubenswrapper[4764]: I0309 13:50:00.166638 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551070-x977g"] Mar 09 13:50:00 crc kubenswrapper[4764]: I0309 13:50:00.193332 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 13:50:00 crc kubenswrapper[4764]: I0309 13:50:00.193689 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:50:00 crc kubenswrapper[4764]: I0309 13:50:00.195816 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:50:00 crc kubenswrapper[4764]: I0309 13:50:00.327206 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kq22\" (UniqueName: \"kubernetes.io/projected/14ef871d-e371-41df-9380-53505557d7ac-kube-api-access-7kq22\") pod 
\"auto-csr-approver-29551070-x977g\" (UID: \"14ef871d-e371-41df-9380-53505557d7ac\") " pod="openshift-infra/auto-csr-approver-29551070-x977g" Mar 09 13:50:00 crc kubenswrapper[4764]: I0309 13:50:00.430655 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kq22\" (UniqueName: \"kubernetes.io/projected/14ef871d-e371-41df-9380-53505557d7ac-kube-api-access-7kq22\") pod \"auto-csr-approver-29551070-x977g\" (UID: \"14ef871d-e371-41df-9380-53505557d7ac\") " pod="openshift-infra/auto-csr-approver-29551070-x977g" Mar 09 13:50:00 crc kubenswrapper[4764]: I0309 13:50:00.453210 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kq22\" (UniqueName: \"kubernetes.io/projected/14ef871d-e371-41df-9380-53505557d7ac-kube-api-access-7kq22\") pod \"auto-csr-approver-29551070-x977g\" (UID: \"14ef871d-e371-41df-9380-53505557d7ac\") " pod="openshift-infra/auto-csr-approver-29551070-x977g" Mar 09 13:50:00 crc kubenswrapper[4764]: I0309 13:50:00.537579 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551070-x977g" Mar 09 13:50:01 crc kubenswrapper[4764]: I0309 13:50:01.016652 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551070-x977g"] Mar 09 13:50:01 crc kubenswrapper[4764]: W0309 13:50:01.020180 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14ef871d_e371_41df_9380_53505557d7ac.slice/crio-669db91d9f1fcc1767a7b1f486e235588479de39ae56a765a944be467e990bb9 WatchSource:0}: Error finding container 669db91d9f1fcc1767a7b1f486e235588479de39ae56a765a944be467e990bb9: Status 404 returned error can't find the container with id 669db91d9f1fcc1767a7b1f486e235588479de39ae56a765a944be467e990bb9 Mar 09 13:50:01 crc kubenswrapper[4764]: I0309 13:50:01.151655 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551070-x977g" event={"ID":"14ef871d-e371-41df-9380-53505557d7ac","Type":"ContainerStarted","Data":"669db91d9f1fcc1767a7b1f486e235588479de39ae56a765a944be467e990bb9"} Mar 09 13:50:01 crc kubenswrapper[4764]: I0309 13:50:01.153867 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" event={"ID":"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add","Type":"ContainerStarted","Data":"1c7c071cdcf2595562a14fc8211f12dd741cead95f0450c54862b35cde78ac5b"} Mar 09 13:50:01 crc kubenswrapper[4764]: I0309 13:50:01.185411 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" podStartSLOduration=1.609421414 podStartE2EDuration="2.185376753s" podCreationTimestamp="2026-03-09 13:49:59 +0000 UTC" firstStartedPulling="2026-03-09 13:50:00.122509801 +0000 UTC m=+1755.372681709" lastFinishedPulling="2026-03-09 13:50:00.69846514 +0000 UTC m=+1755.948637048" observedRunningTime="2026-03-09 13:50:01.17699197 +0000 
UTC m=+1756.427163898" watchObservedRunningTime="2026-03-09 13:50:01.185376753 +0000 UTC m=+1756.435548671" Mar 09 13:50:02 crc kubenswrapper[4764]: I0309 13:50:02.054559 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-d7e9-account-create-update-n7gsb"] Mar 09 13:50:02 crc kubenswrapper[4764]: I0309 13:50:02.068947 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-d7e9-account-create-update-n7gsb"] Mar 09 13:50:03 crc kubenswrapper[4764]: I0309 13:50:03.038811 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-kn2lh"] Mar 09 13:50:03 crc kubenswrapper[4764]: I0309 13:50:03.052570 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-wkxnp"] Mar 09 13:50:03 crc kubenswrapper[4764]: I0309 13:50:03.061930 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-kn2lh"] Mar 09 13:50:03 crc kubenswrapper[4764]: I0309 13:50:03.070150 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-wkxnp"] Mar 09 13:50:03 crc kubenswrapper[4764]: I0309 13:50:03.182099 4764 generic.go:334] "Generic (PLEG): container finished" podID="14ef871d-e371-41df-9380-53505557d7ac" containerID="e8df8bc509784d8d529ce5673fdcec9ae8d5b4cbcf9d86d0b27b355b5bffd393" exitCode=0 Mar 09 13:50:03 crc kubenswrapper[4764]: I0309 13:50:03.182160 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551070-x977g" event={"ID":"14ef871d-e371-41df-9380-53505557d7ac","Type":"ContainerDied","Data":"e8df8bc509784d8d529ce5673fdcec9ae8d5b4cbcf9d86d0b27b355b5bffd393"} Mar 09 13:50:03 crc kubenswrapper[4764]: I0309 13:50:03.560175 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:50:03 crc kubenswrapper[4764]: E0309 13:50:03.560468 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:50:03 crc kubenswrapper[4764]: I0309 13:50:03.591074 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="693ba99b-99d0-4b09-9f49-9deefe05abac" path="/var/lib/kubelet/pods/693ba99b-99d0-4b09-9f49-9deefe05abac/volumes" Mar 09 13:50:03 crc kubenswrapper[4764]: I0309 13:50:03.591826 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75f29150-3689-48a6-9248-b6774f85fcd2" path="/var/lib/kubelet/pods/75f29150-3689-48a6-9248-b6774f85fcd2/volumes" Mar 09 13:50:03 crc kubenswrapper[4764]: I0309 13:50:03.592464 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d681487-9af9-48e3-bb79-569b8c7bf26d" path="/var/lib/kubelet/pods/7d681487-9af9-48e3-bb79-569b8c7bf26d/volumes" Mar 09 13:50:04 crc kubenswrapper[4764]: I0309 13:50:04.040189 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-66ln9"] Mar 09 13:50:04 crc kubenswrapper[4764]: I0309 13:50:04.049855 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-594d-account-create-update-dxsw5"] Mar 09 13:50:04 crc kubenswrapper[4764]: I0309 13:50:04.061881 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-0f8b-account-create-update-mxbcn"] Mar 09 13:50:04 crc kubenswrapper[4764]: I0309 13:50:04.070901 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-0f8b-account-create-update-mxbcn"] Mar 09 13:50:04 crc kubenswrapper[4764]: I0309 13:50:04.078561 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-66ln9"] Mar 09 13:50:04 crc 
kubenswrapper[4764]: I0309 13:50:04.085426 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-594d-account-create-update-dxsw5"] Mar 09 13:50:04 crc kubenswrapper[4764]: I0309 13:50:04.537784 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551070-x977g" Mar 09 13:50:04 crc kubenswrapper[4764]: I0309 13:50:04.635236 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kq22\" (UniqueName: \"kubernetes.io/projected/14ef871d-e371-41df-9380-53505557d7ac-kube-api-access-7kq22\") pod \"14ef871d-e371-41df-9380-53505557d7ac\" (UID: \"14ef871d-e371-41df-9380-53505557d7ac\") " Mar 09 13:50:04 crc kubenswrapper[4764]: I0309 13:50:04.644100 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14ef871d-e371-41df-9380-53505557d7ac-kube-api-access-7kq22" (OuterVolumeSpecName: "kube-api-access-7kq22") pod "14ef871d-e371-41df-9380-53505557d7ac" (UID: "14ef871d-e371-41df-9380-53505557d7ac"). InnerVolumeSpecName "kube-api-access-7kq22". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:50:04 crc kubenswrapper[4764]: I0309 13:50:04.738142 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kq22\" (UniqueName: \"kubernetes.io/projected/14ef871d-e371-41df-9380-53505557d7ac-kube-api-access-7kq22\") on node \"crc\" DevicePath \"\"" Mar 09 13:50:05 crc kubenswrapper[4764]: I0309 13:50:05.207264 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551070-x977g" event={"ID":"14ef871d-e371-41df-9380-53505557d7ac","Type":"ContainerDied","Data":"669db91d9f1fcc1767a7b1f486e235588479de39ae56a765a944be467e990bb9"} Mar 09 13:50:05 crc kubenswrapper[4764]: I0309 13:50:05.207674 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="669db91d9f1fcc1767a7b1f486e235588479de39ae56a765a944be467e990bb9" Mar 09 13:50:05 crc kubenswrapper[4764]: I0309 13:50:05.207357 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551070-x977g" Mar 09 13:50:05 crc kubenswrapper[4764]: I0309 13:50:05.581617 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01e4dc90-6790-447b-ac2a-d2dfcde88d17" path="/var/lib/kubelet/pods/01e4dc90-6790-447b-ac2a-d2dfcde88d17/volumes" Mar 09 13:50:05 crc kubenswrapper[4764]: I0309 13:50:05.583226 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="811ef770-3be6-4f3b-9fc3-dee4df710c4f" path="/var/lib/kubelet/pods/811ef770-3be6-4f3b-9fc3-dee4df710c4f/volumes" Mar 09 13:50:05 crc kubenswrapper[4764]: I0309 13:50:05.585088 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d27c011-b8dd-4f14-9833-413f7a8faf8a" path="/var/lib/kubelet/pods/9d27c011-b8dd-4f14-9833-413f7a8faf8a/volumes" Mar 09 13:50:05 crc kubenswrapper[4764]: I0309 13:50:05.620997 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-infra/auto-csr-approver-29551064-mbrph"] Mar 09 13:50:05 crc kubenswrapper[4764]: I0309 13:50:05.632756 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551064-mbrph"] Mar 09 13:50:05 crc kubenswrapper[4764]: I0309 13:50:05.976810 4764 scope.go:117] "RemoveContainer" containerID="cb8ff00b99c398bb890b473678e6ce951f3d71cc55317946f469bc84cba9ae54" Mar 09 13:50:06 crc kubenswrapper[4764]: I0309 13:50:06.017490 4764 scope.go:117] "RemoveContainer" containerID="01ef1c726b7e7270c862ba5b9e31c73fecc7bf0ea188cc226f1bf52e6cb5af33" Mar 09 13:50:06 crc kubenswrapper[4764]: I0309 13:50:06.065153 4764 scope.go:117] "RemoveContainer" containerID="6bccf5846329079b445a150aae1cbe1d8637bb12a3644ba9afce7863cefc0fe8" Mar 09 13:50:06 crc kubenswrapper[4764]: I0309 13:50:06.115588 4764 scope.go:117] "RemoveContainer" containerID="98058d7a077d55dcc4dc3081744bc24f8ca7e82793695f8748e2850e19fbd5a3" Mar 09 13:50:06 crc kubenswrapper[4764]: I0309 13:50:06.162766 4764 scope.go:117] "RemoveContainer" containerID="da76ff68a5f6a4d7d50712be1138bbea55547749d69b4508eda52be21fd47ce6" Mar 09 13:50:06 crc kubenswrapper[4764]: I0309 13:50:06.208473 4764 scope.go:117] "RemoveContainer" containerID="6de1cfcabddc41034a2e1c58fb3ef2484991e0985876cbbdc822fe1df3641db1" Mar 09 13:50:07 crc kubenswrapper[4764]: I0309 13:50:07.571873 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77179ff3-861b-4aab-b1b2-db4d12041264" path="/var/lib/kubelet/pods/77179ff3-861b-4aab-b1b2-db4d12041264/volumes" Mar 09 13:50:14 crc kubenswrapper[4764]: I0309 13:50:14.560518 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:50:14 crc kubenswrapper[4764]: E0309 13:50:14.561573 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:50:25 crc kubenswrapper[4764]: I0309 13:50:25.052782 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-htpf8"] Mar 09 13:50:25 crc kubenswrapper[4764]: I0309 13:50:25.064978 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-htpf8"] Mar 09 13:50:25 crc kubenswrapper[4764]: I0309 13:50:25.575636 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88e0f4c9-1553-4aca-83f2-e0461ddf062b" path="/var/lib/kubelet/pods/88e0f4c9-1553-4aca-83f2-e0461ddf062b/volumes" Mar 09 13:50:26 crc kubenswrapper[4764]: I0309 13:50:26.559516 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:50:26 crc kubenswrapper[4764]: E0309 13:50:26.560214 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:50:31 crc kubenswrapper[4764]: I0309 13:50:31.039248 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-thhcb"] Mar 09 13:50:31 crc kubenswrapper[4764]: I0309 13:50:31.048735 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-mqv59"] Mar 09 13:50:31 crc kubenswrapper[4764]: I0309 13:50:31.082680 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-thhcb"] Mar 09 
13:50:31 crc kubenswrapper[4764]: I0309 13:50:31.091900 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-15a9-account-create-update-5s8tj"] Mar 09 13:50:31 crc kubenswrapper[4764]: I0309 13:50:31.100868 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-mqv59"] Mar 09 13:50:31 crc kubenswrapper[4764]: I0309 13:50:31.111927 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-15a9-account-create-update-5s8tj"] Mar 09 13:50:31 crc kubenswrapper[4764]: I0309 13:50:31.570795 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46124175-b282-444f-8d9c-0397e35cf8ae" path="/var/lib/kubelet/pods/46124175-b282-444f-8d9c-0397e35cf8ae/volumes" Mar 09 13:50:31 crc kubenswrapper[4764]: I0309 13:50:31.571676 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="642b5df5-dec0-47cc-9595-02b254277452" path="/var/lib/kubelet/pods/642b5df5-dec0-47cc-9595-02b254277452/volumes" Mar 09 13:50:31 crc kubenswrapper[4764]: I0309 13:50:31.572437 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82410bc0-aa4c-450d-8fbc-67cfb9dd615b" path="/var/lib/kubelet/pods/82410bc0-aa4c-450d-8fbc-67cfb9dd615b/volumes" Mar 09 13:50:32 crc kubenswrapper[4764]: I0309 13:50:32.038527 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-a166-account-create-update-kswwc"] Mar 09 13:50:32 crc kubenswrapper[4764]: I0309 13:50:32.057521 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-gkf9g"] Mar 09 13:50:32 crc kubenswrapper[4764]: I0309 13:50:32.070006 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-6d6f-account-create-update-qxm8j"] Mar 09 13:50:32 crc kubenswrapper[4764]: I0309 13:50:32.081806 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-gkf9g"] Mar 09 13:50:32 crc kubenswrapper[4764]: I0309 13:50:32.090999 4764 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-6d6f-account-create-update-qxm8j"] Mar 09 13:50:32 crc kubenswrapper[4764]: I0309 13:50:32.100059 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-a166-account-create-update-kswwc"] Mar 09 13:50:33 crc kubenswrapper[4764]: I0309 13:50:33.571595 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf72fda-56e5-427c-b2d0-8267613d8a9e" path="/var/lib/kubelet/pods/1bf72fda-56e5-427c-b2d0-8267613d8a9e/volumes" Mar 09 13:50:33 crc kubenswrapper[4764]: I0309 13:50:33.572573 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b86f9b8-6493-4a60-85b3-12057a6a8f65" path="/var/lib/kubelet/pods/5b86f9b8-6493-4a60-85b3-12057a6a8f65/volumes" Mar 09 13:50:33 crc kubenswrapper[4764]: I0309 13:50:33.574725 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad7d32c2-ffe4-43d5-8640-6219f863bc2a" path="/var/lib/kubelet/pods/ad7d32c2-ffe4-43d5-8640-6219f863bc2a/volumes" Mar 09 13:50:34 crc kubenswrapper[4764]: I0309 13:50:34.528933 4764 generic.go:334] "Generic (PLEG): container finished" podID="7ee15cfe-dd3c-4cc7-bf8f-b324397f4add" containerID="1c7c071cdcf2595562a14fc8211f12dd741cead95f0450c54862b35cde78ac5b" exitCode=0 Mar 09 13:50:34 crc kubenswrapper[4764]: I0309 13:50:34.529015 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" event={"ID":"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add","Type":"ContainerDied","Data":"1c7c071cdcf2595562a14fc8211f12dd741cead95f0450c54862b35cde78ac5b"} Mar 09 13:50:35 crc kubenswrapper[4764]: I0309 13:50:35.036048 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-vzxr2"] Mar 09 13:50:35 crc kubenswrapper[4764]: I0309 13:50:35.044339 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-vzxr2"] Mar 09 13:50:35 crc kubenswrapper[4764]: I0309 
13:50:35.577229 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29e20119-f7d3-4b10-82c3-afbfa462c831" path="/var/lib/kubelet/pods/29e20119-f7d3-4b10-82c3-afbfa462c831/volumes" Mar 09 13:50:35 crc kubenswrapper[4764]: I0309 13:50:35.980354 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.104269 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-inventory\") pod \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\" (UID: \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\") " Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.104444 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7snl\" (UniqueName: \"kubernetes.io/projected/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-kube-api-access-k7snl\") pod \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\" (UID: \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\") " Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.104494 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-ssh-key-openstack-edpm-ipam\") pod \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\" (UID: \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\") " Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.115165 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-kube-api-access-k7snl" (OuterVolumeSpecName: "kube-api-access-k7snl") pod "7ee15cfe-dd3c-4cc7-bf8f-b324397f4add" (UID: "7ee15cfe-dd3c-4cc7-bf8f-b324397f4add"). InnerVolumeSpecName "kube-api-access-k7snl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:50:36 crc kubenswrapper[4764]: E0309 13:50:36.130935 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-ssh-key-openstack-edpm-ipam podName:7ee15cfe-dd3c-4cc7-bf8f-b324397f4add nodeName:}" failed. No retries permitted until 2026-03-09 13:50:36.630890801 +0000 UTC m=+1791.881062709 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ssh-key-openstack-edpm-ipam" (UniqueName: "kubernetes.io/secret/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-ssh-key-openstack-edpm-ipam") pod "7ee15cfe-dd3c-4cc7-bf8f-b324397f4add" (UID: "7ee15cfe-dd3c-4cc7-bf8f-b324397f4add") : error deleting /var/lib/kubelet/pods/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add/volume-subpaths: remove /var/lib/kubelet/pods/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add/volume-subpaths: no such file or directory Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.134039 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-inventory" (OuterVolumeSpecName: "inventory") pod "7ee15cfe-dd3c-4cc7-bf8f-b324397f4add" (UID: "7ee15cfe-dd3c-4cc7-bf8f-b324397f4add"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.207898 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.207937 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7snl\" (UniqueName: \"kubernetes.io/projected/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-kube-api-access-k7snl\") on node \"crc\" DevicePath \"\"" Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.551363 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" event={"ID":"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add","Type":"ContainerDied","Data":"63d45c8e8448e981e4b409a90ffed5471ff40c994d9269a761fcbf3a1ba24c89"} Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.551855 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63d45c8e8448e981e4b409a90ffed5471ff40c994d9269a761fcbf3a1ba24c89" Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.552136 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.658143 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh"] Mar 09 13:50:36 crc kubenswrapper[4764]: E0309 13:50:36.658614 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ee15cfe-dd3c-4cc7-bf8f-b324397f4add" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.658630 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ee15cfe-dd3c-4cc7-bf8f-b324397f4add" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:50:36 crc kubenswrapper[4764]: E0309 13:50:36.658660 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14ef871d-e371-41df-9380-53505557d7ac" containerName="oc" Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.658667 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="14ef871d-e371-41df-9380-53505557d7ac" containerName="oc" Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.658879 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ee15cfe-dd3c-4cc7-bf8f-b324397f4add" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.658890 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="14ef871d-e371-41df-9380-53505557d7ac" containerName="oc" Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.659568 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh"] Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.659690 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh" Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.720770 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-ssh-key-openstack-edpm-ipam\") pod \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\" (UID: \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\") " Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.725454 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7ee15cfe-dd3c-4cc7-bf8f-b324397f4add" (UID: "7ee15cfe-dd3c-4cc7-bf8f-b324397f4add"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.825356 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9fc5b263-ac73-4b6e-8e41-4ed508765c55-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh\" (UID: \"9fc5b263-ac73-4b6e-8e41-4ed508765c55\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh" Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.825461 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4v4b\" (UniqueName: \"kubernetes.io/projected/9fc5b263-ac73-4b6e-8e41-4ed508765c55-kube-api-access-p4v4b\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh\" (UID: \"9fc5b263-ac73-4b6e-8e41-4ed508765c55\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh" Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.825536 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fc5b263-ac73-4b6e-8e41-4ed508765c55-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh\" (UID: \"9fc5b263-ac73-4b6e-8e41-4ed508765c55\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh" Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.825637 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.927376 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9fc5b263-ac73-4b6e-8e41-4ed508765c55-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh\" (UID: \"9fc5b263-ac73-4b6e-8e41-4ed508765c55\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh" Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.927451 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4v4b\" (UniqueName: \"kubernetes.io/projected/9fc5b263-ac73-4b6e-8e41-4ed508765c55-kube-api-access-p4v4b\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh\" (UID: \"9fc5b263-ac73-4b6e-8e41-4ed508765c55\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh" Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.927521 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fc5b263-ac73-4b6e-8e41-4ed508765c55-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh\" (UID: \"9fc5b263-ac73-4b6e-8e41-4ed508765c55\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh" Mar 09 
13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.930887 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fc5b263-ac73-4b6e-8e41-4ed508765c55-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh\" (UID: \"9fc5b263-ac73-4b6e-8e41-4ed508765c55\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh" Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.931030 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9fc5b263-ac73-4b6e-8e41-4ed508765c55-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh\" (UID: \"9fc5b263-ac73-4b6e-8e41-4ed508765c55\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh" Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.951814 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4v4b\" (UniqueName: \"kubernetes.io/projected/9fc5b263-ac73-4b6e-8e41-4ed508765c55-kube-api-access-p4v4b\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh\" (UID: \"9fc5b263-ac73-4b6e-8e41-4ed508765c55\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh" Mar 09 13:50:37 crc kubenswrapper[4764]: I0309 13:50:37.020837 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh" Mar 09 13:50:37 crc kubenswrapper[4764]: I0309 13:50:37.612473 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh"] Mar 09 13:50:37 crc kubenswrapper[4764]: I0309 13:50:37.641245 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 13:50:38 crc kubenswrapper[4764]: I0309 13:50:38.613896 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh" event={"ID":"9fc5b263-ac73-4b6e-8e41-4ed508765c55","Type":"ContainerStarted","Data":"3910a772e45d62c401146624cbc27497eee00dea57c88064e7e1af0b907bbfcc"} Mar 09 13:50:38 crc kubenswrapper[4764]: I0309 13:50:38.614294 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh" event={"ID":"9fc5b263-ac73-4b6e-8e41-4ed508765c55","Type":"ContainerStarted","Data":"8d3ad1d27f26f2b3a6ec93499c393cc129f483e4e450e92a9faf68cba3228e37"} Mar 09 13:50:38 crc kubenswrapper[4764]: I0309 13:50:38.640488 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh" podStartSLOduration=2.149468757 podStartE2EDuration="2.640464689s" podCreationTimestamp="2026-03-09 13:50:36 +0000 UTC" firstStartedPulling="2026-03-09 13:50:37.639172 +0000 UTC m=+1792.889343908" lastFinishedPulling="2026-03-09 13:50:38.130167932 +0000 UTC m=+1793.380339840" observedRunningTime="2026-03-09 13:50:38.631982373 +0000 UTC m=+1793.882154281" watchObservedRunningTime="2026-03-09 13:50:38.640464689 +0000 UTC m=+1793.890636597" Mar 09 13:50:40 crc kubenswrapper[4764]: I0309 13:50:40.560619 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:50:40 crc 
kubenswrapper[4764]: E0309 13:50:40.561329 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:50:41 crc kubenswrapper[4764]: I0309 13:50:41.033247 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-rjx7v"] Mar 09 13:50:41 crc kubenswrapper[4764]: I0309 13:50:41.041932 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-rjx7v"] Mar 09 13:50:41 crc kubenswrapper[4764]: I0309 13:50:41.572077 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea9388c2-526b-49ff-8f42-03ca66ae08dd" path="/var/lib/kubelet/pods/ea9388c2-526b-49ff-8f42-03ca66ae08dd/volumes" Mar 09 13:50:42 crc kubenswrapper[4764]: I0309 13:50:42.658600 4764 generic.go:334] "Generic (PLEG): container finished" podID="9fc5b263-ac73-4b6e-8e41-4ed508765c55" containerID="3910a772e45d62c401146624cbc27497eee00dea57c88064e7e1af0b907bbfcc" exitCode=0 Mar 09 13:50:42 crc kubenswrapper[4764]: I0309 13:50:42.658728 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh" event={"ID":"9fc5b263-ac73-4b6e-8e41-4ed508765c55","Type":"ContainerDied","Data":"3910a772e45d62c401146624cbc27497eee00dea57c88064e7e1af0b907bbfcc"} Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.127967 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh" Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.312166 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4v4b\" (UniqueName: \"kubernetes.io/projected/9fc5b263-ac73-4b6e-8e41-4ed508765c55-kube-api-access-p4v4b\") pod \"9fc5b263-ac73-4b6e-8e41-4ed508765c55\" (UID: \"9fc5b263-ac73-4b6e-8e41-4ed508765c55\") " Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.312488 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9fc5b263-ac73-4b6e-8e41-4ed508765c55-ssh-key-openstack-edpm-ipam\") pod \"9fc5b263-ac73-4b6e-8e41-4ed508765c55\" (UID: \"9fc5b263-ac73-4b6e-8e41-4ed508765c55\") " Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.312568 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fc5b263-ac73-4b6e-8e41-4ed508765c55-inventory\") pod \"9fc5b263-ac73-4b6e-8e41-4ed508765c55\" (UID: \"9fc5b263-ac73-4b6e-8e41-4ed508765c55\") " Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.319913 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fc5b263-ac73-4b6e-8e41-4ed508765c55-kube-api-access-p4v4b" (OuterVolumeSpecName: "kube-api-access-p4v4b") pod "9fc5b263-ac73-4b6e-8e41-4ed508765c55" (UID: "9fc5b263-ac73-4b6e-8e41-4ed508765c55"). InnerVolumeSpecName "kube-api-access-p4v4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.344322 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fc5b263-ac73-4b6e-8e41-4ed508765c55-inventory" (OuterVolumeSpecName: "inventory") pod "9fc5b263-ac73-4b6e-8e41-4ed508765c55" (UID: "9fc5b263-ac73-4b6e-8e41-4ed508765c55"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.347945 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fc5b263-ac73-4b6e-8e41-4ed508765c55-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9fc5b263-ac73-4b6e-8e41-4ed508765c55" (UID: "9fc5b263-ac73-4b6e-8e41-4ed508765c55"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.415487 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4v4b\" (UniqueName: \"kubernetes.io/projected/9fc5b263-ac73-4b6e-8e41-4ed508765c55-kube-api-access-p4v4b\") on node \"crc\" DevicePath \"\"" Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.415837 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9fc5b263-ac73-4b6e-8e41-4ed508765c55-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.415991 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fc5b263-ac73-4b6e-8e41-4ed508765c55-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.684576 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh" event={"ID":"9fc5b263-ac73-4b6e-8e41-4ed508765c55","Type":"ContainerDied","Data":"8d3ad1d27f26f2b3a6ec93499c393cc129f483e4e450e92a9faf68cba3228e37"} Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.684631 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh" Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.684672 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d3ad1d27f26f2b3a6ec93499c393cc129f483e4e450e92a9faf68cba3228e37" Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.769981 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w"] Mar 09 13:50:44 crc kubenswrapper[4764]: E0309 13:50:44.770676 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fc5b263-ac73-4b6e-8e41-4ed508765c55" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.770699 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fc5b263-ac73-4b6e-8e41-4ed508765c55" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.770920 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fc5b263-ac73-4b6e-8e41-4ed508765c55" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.771836 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w" Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.775066 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.775148 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.775351 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.775396 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.781707 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w"] Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.826129 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1dbc4eda-5f77-4951-962f-9ed0b1308df0-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hl84w\" (UID: \"1dbc4eda-5f77-4951-962f-9ed0b1308df0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w" Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.826208 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p475q\" (UniqueName: \"kubernetes.io/projected/1dbc4eda-5f77-4951-962f-9ed0b1308df0-kube-api-access-p475q\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hl84w\" (UID: \"1dbc4eda-5f77-4951-962f-9ed0b1308df0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w" Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.826379 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1dbc4eda-5f77-4951-962f-9ed0b1308df0-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hl84w\" (UID: \"1dbc4eda-5f77-4951-962f-9ed0b1308df0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w" Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.928962 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1dbc4eda-5f77-4951-962f-9ed0b1308df0-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hl84w\" (UID: \"1dbc4eda-5f77-4951-962f-9ed0b1308df0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w" Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.929247 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1dbc4eda-5f77-4951-962f-9ed0b1308df0-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hl84w\" (UID: \"1dbc4eda-5f77-4951-962f-9ed0b1308df0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w" Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.929338 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p475q\" (UniqueName: \"kubernetes.io/projected/1dbc4eda-5f77-4951-962f-9ed0b1308df0-kube-api-access-p475q\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hl84w\" (UID: \"1dbc4eda-5f77-4951-962f-9ed0b1308df0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w" Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.937053 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1dbc4eda-5f77-4951-962f-9ed0b1308df0-inventory\") 
pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hl84w\" (UID: \"1dbc4eda-5f77-4951-962f-9ed0b1308df0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w" Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.939764 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1dbc4eda-5f77-4951-962f-9ed0b1308df0-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hl84w\" (UID: \"1dbc4eda-5f77-4951-962f-9ed0b1308df0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w" Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.951250 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p475q\" (UniqueName: \"kubernetes.io/projected/1dbc4eda-5f77-4951-962f-9ed0b1308df0-kube-api-access-p475q\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hl84w\" (UID: \"1dbc4eda-5f77-4951-962f-9ed0b1308df0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w" Mar 09 13:50:45 crc kubenswrapper[4764]: I0309 13:50:45.099627 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w" Mar 09 13:50:45 crc kubenswrapper[4764]: I0309 13:50:45.672117 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w"] Mar 09 13:50:45 crc kubenswrapper[4764]: I0309 13:50:45.696584 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w" event={"ID":"1dbc4eda-5f77-4951-962f-9ed0b1308df0","Type":"ContainerStarted","Data":"785c9329b6d9887145f8b240e44234fa1c35f72d9b221c2ae8340f8c7636b913"} Mar 09 13:50:46 crc kubenswrapper[4764]: I0309 13:50:46.110271 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:50:46 crc kubenswrapper[4764]: I0309 13:50:46.709689 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w" event={"ID":"1dbc4eda-5f77-4951-962f-9ed0b1308df0","Type":"ContainerStarted","Data":"b4cf48a07b982624103805e852925a23f44a6c1f17fbc126f6f6ed00345f5ccd"} Mar 09 13:50:46 crc kubenswrapper[4764]: I0309 13:50:46.743202 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w" podStartSLOduration=2.315529596 podStartE2EDuration="2.743175559s" podCreationTimestamp="2026-03-09 13:50:44 +0000 UTC" firstStartedPulling="2026-03-09 13:50:45.680063502 +0000 UTC m=+1800.930235400" lastFinishedPulling="2026-03-09 13:50:46.107709455 +0000 UTC m=+1801.357881363" observedRunningTime="2026-03-09 13:50:46.734022045 +0000 UTC m=+1801.984194003" watchObservedRunningTime="2026-03-09 13:50:46.743175559 +0000 UTC m=+1801.993347467" Mar 09 13:50:52 crc kubenswrapper[4764]: I0309 13:50:52.560160 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:50:52 crc 
kubenswrapper[4764]: E0309 13:50:52.561445 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:51:04 crc kubenswrapper[4764]: I0309 13:51:04.055996 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-cmhtp"] Mar 09 13:51:04 crc kubenswrapper[4764]: I0309 13:51:04.065914 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-cmhtp"] Mar 09 13:51:05 crc kubenswrapper[4764]: I0309 13:51:05.569592 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:51:05 crc kubenswrapper[4764]: E0309 13:51:05.570362 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:51:05 crc kubenswrapper[4764]: I0309 13:51:05.574996 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34466abc-30eb-4a0c-b4ea-50b5ab368fa1" path="/var/lib/kubelet/pods/34466abc-30eb-4a0c-b4ea-50b5ab368fa1/volumes" Mar 09 13:51:06 crc kubenswrapper[4764]: I0309 13:51:06.365343 4764 scope.go:117] "RemoveContainer" containerID="6b1dbed6bf6e61f7c12ad4f9dbf8d714fbaa8de7054f1b548bd8d7d1b560e4d6" Mar 09 13:51:06 crc kubenswrapper[4764]: I0309 13:51:06.423536 4764 scope.go:117] "RemoveContainer" 
containerID="6b21bf7421d738d162c19d823aaa8d5749330a4586aa8ead5eb3f5fbb8ebcd4e" Mar 09 13:51:06 crc kubenswrapper[4764]: I0309 13:51:06.451390 4764 scope.go:117] "RemoveContainer" containerID="339bad8557439d7afa5c329a435cd35a995246fc604b25e4a8e250f1a7b99368" Mar 09 13:51:06 crc kubenswrapper[4764]: I0309 13:51:06.503759 4764 scope.go:117] "RemoveContainer" containerID="bffbf528e6b05eccf8c0e302264ead1f4f679cc141d19e6aa19cb09ac29fba17" Mar 09 13:51:06 crc kubenswrapper[4764]: I0309 13:51:06.576446 4764 scope.go:117] "RemoveContainer" containerID="06f1bf05b0fa436ae59020308ffb3c9a4a73f8d30f844f8ee3828c34f9dbc702" Mar 09 13:51:06 crc kubenswrapper[4764]: I0309 13:51:06.622392 4764 scope.go:117] "RemoveContainer" containerID="cf71be9d097dd827ce8f90129691408cafc7024d1662f6cee61aed9d868f11b5" Mar 09 13:51:06 crc kubenswrapper[4764]: I0309 13:51:06.651259 4764 scope.go:117] "RemoveContainer" containerID="ccd695e4102eac8ad1bb7aec25a2c1252558ea5824f39855b788c7a3324a1c3b" Mar 09 13:51:06 crc kubenswrapper[4764]: I0309 13:51:06.675206 4764 scope.go:117] "RemoveContainer" containerID="77b865a9fe4889c82ec1f58b4a21addc379165d3a6897c87913c6042aa1f357b" Mar 09 13:51:06 crc kubenswrapper[4764]: I0309 13:51:06.723451 4764 scope.go:117] "RemoveContainer" containerID="63ff43316b127bbf8f65a169da3d3d1192d7c9af3a1493dff44991331dc6c723" Mar 09 13:51:06 crc kubenswrapper[4764]: I0309 13:51:06.751960 4764 scope.go:117] "RemoveContainer" containerID="da404af8a75d74966ac6aec712c910704da97b03d85b23af80911b87587e1ef7" Mar 09 13:51:06 crc kubenswrapper[4764]: I0309 13:51:06.777370 4764 scope.go:117] "RemoveContainer" containerID="8799a3258f6d77e33d1068648c90a3371654d54435ab51ff0aa268e01baf2e9b" Mar 09 13:51:09 crc kubenswrapper[4764]: I0309 13:51:09.035876 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-bnpcj"] Mar 09 13:51:09 crc kubenswrapper[4764]: I0309 13:51:09.043931 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/placement-db-sync-bnpcj"] Mar 09 13:51:09 crc kubenswrapper[4764]: I0309 13:51:09.581302 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1004910c-0db4-4e3d-aac5-358a557ee268" path="/var/lib/kubelet/pods/1004910c-0db4-4e3d-aac5-358a557ee268/volumes" Mar 09 13:51:11 crc kubenswrapper[4764]: I0309 13:51:11.044554 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-9sj6m"] Mar 09 13:51:11 crc kubenswrapper[4764]: I0309 13:51:11.056045 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-9sj6m"] Mar 09 13:51:11 crc kubenswrapper[4764]: I0309 13:51:11.580946 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a338463-1443-4863-830e-0621abc3ed15" path="/var/lib/kubelet/pods/2a338463-1443-4863-830e-0621abc3ed15/volumes" Mar 09 13:51:19 crc kubenswrapper[4764]: I0309 13:51:19.559873 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:51:19 crc kubenswrapper[4764]: E0309 13:51:19.560890 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:51:22 crc kubenswrapper[4764]: I0309 13:51:22.048889 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-x9gvc"] Mar 09 13:51:22 crc kubenswrapper[4764]: I0309 13:51:22.057424 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-x9gvc"] Mar 09 13:51:23 crc kubenswrapper[4764]: I0309 13:51:23.573492 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="cb54f57d-afb6-4e53-be9a-4b22573a9450" path="/var/lib/kubelet/pods/cb54f57d-afb6-4e53-be9a-4b22573a9450/volumes" Mar 09 13:51:30 crc kubenswrapper[4764]: I0309 13:51:30.039140 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-dp5x6"] Mar 09 13:51:30 crc kubenswrapper[4764]: I0309 13:51:30.049010 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-dp5x6"] Mar 09 13:51:31 crc kubenswrapper[4764]: I0309 13:51:31.166714 4764 generic.go:334] "Generic (PLEG): container finished" podID="1dbc4eda-5f77-4951-962f-9ed0b1308df0" containerID="b4cf48a07b982624103805e852925a23f44a6c1f17fbc126f6f6ed00345f5ccd" exitCode=0 Mar 09 13:51:31 crc kubenswrapper[4764]: I0309 13:51:31.166784 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w" event={"ID":"1dbc4eda-5f77-4951-962f-9ed0b1308df0","Type":"ContainerDied","Data":"b4cf48a07b982624103805e852925a23f44a6c1f17fbc126f6f6ed00345f5ccd"} Mar 09 13:51:31 crc kubenswrapper[4764]: I0309 13:51:31.573760 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74146b7d-9780-4d2d-9454-853296f88955" path="/var/lib/kubelet/pods/74146b7d-9780-4d2d-9454-853296f88955/volumes" Mar 09 13:51:32 crc kubenswrapper[4764]: I0309 13:51:32.676486 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w" Mar 09 13:51:32 crc kubenswrapper[4764]: I0309 13:51:32.732167 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1dbc4eda-5f77-4951-962f-9ed0b1308df0-inventory\") pod \"1dbc4eda-5f77-4951-962f-9ed0b1308df0\" (UID: \"1dbc4eda-5f77-4951-962f-9ed0b1308df0\") " Mar 09 13:51:32 crc kubenswrapper[4764]: I0309 13:51:32.732744 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1dbc4eda-5f77-4951-962f-9ed0b1308df0-ssh-key-openstack-edpm-ipam\") pod \"1dbc4eda-5f77-4951-962f-9ed0b1308df0\" (UID: \"1dbc4eda-5f77-4951-962f-9ed0b1308df0\") " Mar 09 13:51:32 crc kubenswrapper[4764]: I0309 13:51:32.733213 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p475q\" (UniqueName: \"kubernetes.io/projected/1dbc4eda-5f77-4951-962f-9ed0b1308df0-kube-api-access-p475q\") pod \"1dbc4eda-5f77-4951-962f-9ed0b1308df0\" (UID: \"1dbc4eda-5f77-4951-962f-9ed0b1308df0\") " Mar 09 13:51:32 crc kubenswrapper[4764]: I0309 13:51:32.740331 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dbc4eda-5f77-4951-962f-9ed0b1308df0-kube-api-access-p475q" (OuterVolumeSpecName: "kube-api-access-p475q") pod "1dbc4eda-5f77-4951-962f-9ed0b1308df0" (UID: "1dbc4eda-5f77-4951-962f-9ed0b1308df0"). InnerVolumeSpecName "kube-api-access-p475q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:51:32 crc kubenswrapper[4764]: I0309 13:51:32.765596 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dbc4eda-5f77-4951-962f-9ed0b1308df0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1dbc4eda-5f77-4951-962f-9ed0b1308df0" (UID: "1dbc4eda-5f77-4951-962f-9ed0b1308df0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:51:32 crc kubenswrapper[4764]: I0309 13:51:32.765980 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dbc4eda-5f77-4951-962f-9ed0b1308df0-inventory" (OuterVolumeSpecName: "inventory") pod "1dbc4eda-5f77-4951-962f-9ed0b1308df0" (UID: "1dbc4eda-5f77-4951-962f-9ed0b1308df0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:51:32 crc kubenswrapper[4764]: I0309 13:51:32.834589 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p475q\" (UniqueName: \"kubernetes.io/projected/1dbc4eda-5f77-4951-962f-9ed0b1308df0-kube-api-access-p475q\") on node \"crc\" DevicePath \"\"" Mar 09 13:51:32 crc kubenswrapper[4764]: I0309 13:51:32.834630 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1dbc4eda-5f77-4951-962f-9ed0b1308df0-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:51:32 crc kubenswrapper[4764]: I0309 13:51:32.834655 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1dbc4eda-5f77-4951-962f-9ed0b1308df0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.191151 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w" 
event={"ID":"1dbc4eda-5f77-4951-962f-9ed0b1308df0","Type":"ContainerDied","Data":"785c9329b6d9887145f8b240e44234fa1c35f72d9b221c2ae8340f8c7636b913"} Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.191215 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="785c9329b6d9887145f8b240e44234fa1c35f72d9b221c2ae8340f8c7636b913" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.191237 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.282572 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-ljpp5"] Mar 09 13:51:33 crc kubenswrapper[4764]: E0309 13:51:33.283033 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dbc4eda-5f77-4951-962f-9ed0b1308df0" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.283055 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dbc4eda-5f77-4951-962f-9ed0b1308df0" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.283225 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dbc4eda-5f77-4951-962f-9ed0b1308df0" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.283957 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.287635 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.288562 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.288722 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.288946 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.299010 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-ljpp5"] Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.347007 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-ljpp5\" (UID: \"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2\") " pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.347091 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b66q\" (UniqueName: \"kubernetes.io/projected/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-kube-api-access-8b66q\") pod \"ssh-known-hosts-edpm-deployment-ljpp5\" (UID: \"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2\") " pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.347684 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-ljpp5\" (UID: \"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2\") " pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.449900 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-ljpp5\" (UID: \"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2\") " pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.450021 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-ljpp5\" (UID: \"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2\") " pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.450120 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b66q\" (UniqueName: \"kubernetes.io/projected/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-kube-api-access-8b66q\") pod \"ssh-known-hosts-edpm-deployment-ljpp5\" (UID: \"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2\") " pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.456024 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-ljpp5\" (UID: \"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2\") " pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" Mar 
09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.458179 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-ljpp5\" (UID: \"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2\") " pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.471991 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b66q\" (UniqueName: \"kubernetes.io/projected/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-kube-api-access-8b66q\") pod \"ssh-known-hosts-edpm-deployment-ljpp5\" (UID: \"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2\") " pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.561254 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:51:33 crc kubenswrapper[4764]: E0309 13:51:33.561486 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.679164 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" Mar 09 13:51:34 crc kubenswrapper[4764]: I0309 13:51:34.239689 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-ljpp5"] Mar 09 13:51:35 crc kubenswrapper[4764]: I0309 13:51:35.215605 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" event={"ID":"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2","Type":"ContainerStarted","Data":"a382abef431f7e8dce93c27f5f5efe18631bdeb380db25088eb96acfd690b493"} Mar 09 13:51:35 crc kubenswrapper[4764]: I0309 13:51:35.216075 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" event={"ID":"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2","Type":"ContainerStarted","Data":"590f3b4e6365bbda04e720f9a91726b9088576faa0e67efa55c191f171559bef"} Mar 09 13:51:35 crc kubenswrapper[4764]: I0309 13:51:35.239304 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" podStartSLOduration=1.776795481 podStartE2EDuration="2.23928054s" podCreationTimestamp="2026-03-09 13:51:33 +0000 UTC" firstStartedPulling="2026-03-09 13:51:34.248662562 +0000 UTC m=+1849.498834480" lastFinishedPulling="2026-03-09 13:51:34.711147611 +0000 UTC m=+1849.961319539" observedRunningTime="2026-03-09 13:51:35.237610106 +0000 UTC m=+1850.487782064" watchObservedRunningTime="2026-03-09 13:51:35.23928054 +0000 UTC m=+1850.489452448" Mar 09 13:51:42 crc kubenswrapper[4764]: I0309 13:51:42.289864 4764 generic.go:334] "Generic (PLEG): container finished" podID="9ec29e7b-3537-459a-bfb4-acc93e1e5ec2" containerID="a382abef431f7e8dce93c27f5f5efe18631bdeb380db25088eb96acfd690b493" exitCode=0 Mar 09 13:51:42 crc kubenswrapper[4764]: I0309 13:51:42.290049 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" 
event={"ID":"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2","Type":"ContainerDied","Data":"a382abef431f7e8dce93c27f5f5efe18631bdeb380db25088eb96acfd690b493"} Mar 09 13:51:43 crc kubenswrapper[4764]: I0309 13:51:43.760345 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" Mar 09 13:51:43 crc kubenswrapper[4764]: I0309 13:51:43.801905 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-inventory-0\") pod \"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2\" (UID: \"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2\") " Mar 09 13:51:43 crc kubenswrapper[4764]: I0309 13:51:43.801996 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-ssh-key-openstack-edpm-ipam\") pod \"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2\" (UID: \"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2\") " Mar 09 13:51:43 crc kubenswrapper[4764]: I0309 13:51:43.802179 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b66q\" (UniqueName: \"kubernetes.io/projected/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-kube-api-access-8b66q\") pod \"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2\" (UID: \"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2\") " Mar 09 13:51:43 crc kubenswrapper[4764]: I0309 13:51:43.833321 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-kube-api-access-8b66q" (OuterVolumeSpecName: "kube-api-access-8b66q") pod "9ec29e7b-3537-459a-bfb4-acc93e1e5ec2" (UID: "9ec29e7b-3537-459a-bfb4-acc93e1e5ec2"). InnerVolumeSpecName "kube-api-access-8b66q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:51:43 crc kubenswrapper[4764]: I0309 13:51:43.836858 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9ec29e7b-3537-459a-bfb4-acc93e1e5ec2" (UID: "9ec29e7b-3537-459a-bfb4-acc93e1e5ec2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:51:43 crc kubenswrapper[4764]: I0309 13:51:43.861307 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "9ec29e7b-3537-459a-bfb4-acc93e1e5ec2" (UID: "9ec29e7b-3537-459a-bfb4-acc93e1e5ec2"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:51:43 crc kubenswrapper[4764]: I0309 13:51:43.905919 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b66q\" (UniqueName: \"kubernetes.io/projected/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-kube-api-access-8b66q\") on node \"crc\" DevicePath \"\"" Mar 09 13:51:43 crc kubenswrapper[4764]: I0309 13:51:43.905967 4764 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 09 13:51:43 crc kubenswrapper[4764]: I0309 13:51:43.905982 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.312074 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" 
event={"ID":"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2","Type":"ContainerDied","Data":"590f3b4e6365bbda04e720f9a91726b9088576faa0e67efa55c191f171559bef"} Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.312464 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="590f3b4e6365bbda04e720f9a91726b9088576faa0e67efa55c191f171559bef" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.312164 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.404461 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb"] Mar 09 13:51:44 crc kubenswrapper[4764]: E0309 13:51:44.405178 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ec29e7b-3537-459a-bfb4-acc93e1e5ec2" containerName="ssh-known-hosts-edpm-deployment" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.405207 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ec29e7b-3537-459a-bfb4-acc93e1e5ec2" containerName="ssh-known-hosts-edpm-deployment" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.405432 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ec29e7b-3537-459a-bfb4-acc93e1e5ec2" containerName="ssh-known-hosts-edpm-deployment" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.406362 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.408633 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.408994 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.409033 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.409345 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.423392 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb"] Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.537465 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-x4txb\" (UID: \"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.537735 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-x4txb\" (UID: \"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.537803 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnhps\" (UniqueName: \"kubernetes.io/projected/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-kube-api-access-lnhps\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-x4txb\" (UID: \"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.639427 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-x4txb\" (UID: \"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.639818 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnhps\" (UniqueName: \"kubernetes.io/projected/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-kube-api-access-lnhps\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-x4txb\" (UID: \"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.640018 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-x4txb\" (UID: \"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.646383 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-x4txb\" (UID: 
\"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.646442 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-x4txb\" (UID: \"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.660067 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnhps\" (UniqueName: \"kubernetes.io/projected/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-kube-api-access-lnhps\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-x4txb\" (UID: \"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.728585 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" Mar 09 13:51:45 crc kubenswrapper[4764]: I0309 13:51:45.316370 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb"] Mar 09 13:51:45 crc kubenswrapper[4764]: I0309 13:51:45.570138 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:51:45 crc kubenswrapper[4764]: E0309 13:51:45.570968 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:51:45 crc kubenswrapper[4764]: I0309 13:51:45.841110 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:51:46 crc kubenswrapper[4764]: I0309 13:51:46.352146 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" event={"ID":"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c","Type":"ContainerStarted","Data":"b589f4b8fd5031689b4581120afd3a37af8b44cf26e7b85ac17d85bceae4a6f2"} Mar 09 13:51:46 crc kubenswrapper[4764]: I0309 13:51:46.352716 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" event={"ID":"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c","Type":"ContainerStarted","Data":"75cbd682df57607b22391ec4a112086032d72c0b62435c3d41b1c5b15dfde7fd"} Mar 09 13:51:46 crc kubenswrapper[4764]: I0309 13:51:46.372381 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" 
podStartSLOduration=1.871959952 podStartE2EDuration="2.372358882s" podCreationTimestamp="2026-03-09 13:51:44 +0000 UTC" firstStartedPulling="2026-03-09 13:51:45.337401291 +0000 UTC m=+1860.587573209" lastFinishedPulling="2026-03-09 13:51:45.837800211 +0000 UTC m=+1861.087972139" observedRunningTime="2026-03-09 13:51:46.368987202 +0000 UTC m=+1861.619159110" watchObservedRunningTime="2026-03-09 13:51:46.372358882 +0000 UTC m=+1861.622530790" Mar 09 13:51:53 crc kubenswrapper[4764]: I0309 13:51:53.423511 4764 generic.go:334] "Generic (PLEG): container finished" podID="d5f7b3e4-69e8-4529-9973-63b6af6b5e5c" containerID="b589f4b8fd5031689b4581120afd3a37af8b44cf26e7b85ac17d85bceae4a6f2" exitCode=0 Mar 09 13:51:53 crc kubenswrapper[4764]: I0309 13:51:53.423622 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" event={"ID":"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c","Type":"ContainerDied","Data":"b589f4b8fd5031689b4581120afd3a37af8b44cf26e7b85ac17d85bceae4a6f2"} Mar 09 13:51:54 crc kubenswrapper[4764]: I0309 13:51:54.882837 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" Mar 09 13:51:54 crc kubenswrapper[4764]: I0309 13:51:54.995772 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnhps\" (UniqueName: \"kubernetes.io/projected/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-kube-api-access-lnhps\") pod \"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c\" (UID: \"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c\") " Mar 09 13:51:54 crc kubenswrapper[4764]: I0309 13:51:54.996111 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-inventory\") pod \"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c\" (UID: \"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c\") " Mar 09 13:51:54 crc kubenswrapper[4764]: I0309 13:51:54.996151 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-ssh-key-openstack-edpm-ipam\") pod \"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c\" (UID: \"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c\") " Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.003522 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-kube-api-access-lnhps" (OuterVolumeSpecName: "kube-api-access-lnhps") pod "d5f7b3e4-69e8-4529-9973-63b6af6b5e5c" (UID: "d5f7b3e4-69e8-4529-9973-63b6af6b5e5c"). InnerVolumeSpecName "kube-api-access-lnhps". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.037208 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-inventory" (OuterVolumeSpecName: "inventory") pod "d5f7b3e4-69e8-4529-9973-63b6af6b5e5c" (UID: "d5f7b3e4-69e8-4529-9973-63b6af6b5e5c"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.037526 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d5f7b3e4-69e8-4529-9973-63b6af6b5e5c" (UID: "d5f7b3e4-69e8-4529-9973-63b6af6b5e5c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.099499 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnhps\" (UniqueName: \"kubernetes.io/projected/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-kube-api-access-lnhps\") on node \"crc\" DevicePath \"\"" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.099572 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.099594 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.452133 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" event={"ID":"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c","Type":"ContainerDied","Data":"75cbd682df57607b22391ec4a112086032d72c0b62435c3d41b1c5b15dfde7fd"} Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.452189 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75cbd682df57607b22391ec4a112086032d72c0b62435c3d41b1c5b15dfde7fd" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 
13:51:55.452238 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.637610 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw"] Mar 09 13:51:55 crc kubenswrapper[4764]: E0309 13:51:55.638144 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5f7b3e4-69e8-4529-9973-63b6af6b5e5c" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.638162 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5f7b3e4-69e8-4529-9973-63b6af6b5e5c" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.638356 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5f7b3e4-69e8-4529-9973-63b6af6b5e5c" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.639486 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.642191 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.642218 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.642538 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.642991 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.656999 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw"] Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.712337 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65n7r\" (UniqueName: \"kubernetes.io/projected/c0456561-9f16-4a32-b3ec-6ab6aa808b76-kube-api-access-65n7r\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw\" (UID: \"c0456561-9f16-4a32-b3ec-6ab6aa808b76\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.712425 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0456561-9f16-4a32-b3ec-6ab6aa808b76-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw\" (UID: \"c0456561-9f16-4a32-b3ec-6ab6aa808b76\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.712661 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0456561-9f16-4a32-b3ec-6ab6aa808b76-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw\" (UID: \"c0456561-9f16-4a32-b3ec-6ab6aa808b76\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.814896 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0456561-9f16-4a32-b3ec-6ab6aa808b76-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw\" (UID: \"c0456561-9f16-4a32-b3ec-6ab6aa808b76\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.815306 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0456561-9f16-4a32-b3ec-6ab6aa808b76-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw\" (UID: \"c0456561-9f16-4a32-b3ec-6ab6aa808b76\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.815553 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65n7r\" (UniqueName: \"kubernetes.io/projected/c0456561-9f16-4a32-b3ec-6ab6aa808b76-kube-api-access-65n7r\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw\" (UID: \"c0456561-9f16-4a32-b3ec-6ab6aa808b76\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.819508 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0456561-9f16-4a32-b3ec-6ab6aa808b76-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw\" (UID: \"c0456561-9f16-4a32-b3ec-6ab6aa808b76\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.822683 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0456561-9f16-4a32-b3ec-6ab6aa808b76-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw\" (UID: \"c0456561-9f16-4a32-b3ec-6ab6aa808b76\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.834631 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65n7r\" (UniqueName: \"kubernetes.io/projected/c0456561-9f16-4a32-b3ec-6ab6aa808b76-kube-api-access-65n7r\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw\" (UID: \"c0456561-9f16-4a32-b3ec-6ab6aa808b76\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.960711 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" Mar 09 13:51:56 crc kubenswrapper[4764]: I0309 13:51:56.530927 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw"] Mar 09 13:51:57 crc kubenswrapper[4764]: I0309 13:51:57.473173 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" event={"ID":"c0456561-9f16-4a32-b3ec-6ab6aa808b76","Type":"ContainerStarted","Data":"860e53941ce3fa2c04ed10b54cb5f6fa59f9d4631186670908ce0c966139e37f"} Mar 09 13:51:57 crc kubenswrapper[4764]: I0309 13:51:57.474853 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" event={"ID":"c0456561-9f16-4a32-b3ec-6ab6aa808b76","Type":"ContainerStarted","Data":"7d9162164851ca102e807aeca92eff9a13dffecebd98044e7f6059bda0fecc3a"} Mar 09 13:51:57 crc kubenswrapper[4764]: I0309 13:51:57.510843 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" podStartSLOduration=2.078802439 podStartE2EDuration="2.510810984s" podCreationTimestamp="2026-03-09 13:51:55 +0000 UTC" firstStartedPulling="2026-03-09 13:51:56.539717397 +0000 UTC m=+1871.789889305" lastFinishedPulling="2026-03-09 13:51:56.971725942 +0000 UTC m=+1872.221897850" observedRunningTime="2026-03-09 13:51:57.500998213 +0000 UTC m=+1872.751170171" watchObservedRunningTime="2026-03-09 13:51:57.510810984 +0000 UTC m=+1872.760982922" Mar 09 13:52:00 crc kubenswrapper[4764]: I0309 13:52:00.142587 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551072-2q8rg"] Mar 09 13:52:00 crc kubenswrapper[4764]: I0309 13:52:00.144485 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551072-2q8rg" Mar 09 13:52:00 crc kubenswrapper[4764]: I0309 13:52:00.146811 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 13:52:00 crc kubenswrapper[4764]: I0309 13:52:00.147153 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:52:00 crc kubenswrapper[4764]: I0309 13:52:00.147791 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:52:00 crc kubenswrapper[4764]: I0309 13:52:00.153414 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551072-2q8rg"] Mar 09 13:52:00 crc kubenswrapper[4764]: I0309 13:52:00.213971 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czbpz\" (UniqueName: \"kubernetes.io/projected/f422975f-b0ee-4ef9-be32-3aac0003a54d-kube-api-access-czbpz\") pod \"auto-csr-approver-29551072-2q8rg\" (UID: \"f422975f-b0ee-4ef9-be32-3aac0003a54d\") " pod="openshift-infra/auto-csr-approver-29551072-2q8rg" Mar 09 13:52:00 crc kubenswrapper[4764]: I0309 13:52:00.316963 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czbpz\" (UniqueName: \"kubernetes.io/projected/f422975f-b0ee-4ef9-be32-3aac0003a54d-kube-api-access-czbpz\") pod \"auto-csr-approver-29551072-2q8rg\" (UID: \"f422975f-b0ee-4ef9-be32-3aac0003a54d\") " pod="openshift-infra/auto-csr-approver-29551072-2q8rg" Mar 09 13:52:00 crc kubenswrapper[4764]: I0309 13:52:00.338102 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czbpz\" (UniqueName: \"kubernetes.io/projected/f422975f-b0ee-4ef9-be32-3aac0003a54d-kube-api-access-czbpz\") pod \"auto-csr-approver-29551072-2q8rg\" (UID: \"f422975f-b0ee-4ef9-be32-3aac0003a54d\") " 
pod="openshift-infra/auto-csr-approver-29551072-2q8rg" Mar 09 13:52:00 crc kubenswrapper[4764]: I0309 13:52:00.479425 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551072-2q8rg" Mar 09 13:52:00 crc kubenswrapper[4764]: I0309 13:52:00.560859 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:52:00 crc kubenswrapper[4764]: E0309 13:52:00.561190 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:52:00 crc kubenswrapper[4764]: I0309 13:52:00.939672 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551072-2q8rg"] Mar 09 13:52:00 crc kubenswrapper[4764]: W0309 13:52:00.946041 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf422975f_b0ee_4ef9_be32_3aac0003a54d.slice/crio-0dbd38e51658ebae4316351941269fda1f723a2cb9d62a11c545d824da270a51 WatchSource:0}: Error finding container 0dbd38e51658ebae4316351941269fda1f723a2cb9d62a11c545d824da270a51: Status 404 returned error can't find the container with id 0dbd38e51658ebae4316351941269fda1f723a2cb9d62a11c545d824da270a51 Mar 09 13:52:01 crc kubenswrapper[4764]: I0309 13:52:01.517849 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551072-2q8rg" event={"ID":"f422975f-b0ee-4ef9-be32-3aac0003a54d","Type":"ContainerStarted","Data":"0dbd38e51658ebae4316351941269fda1f723a2cb9d62a11c545d824da270a51"} Mar 09 13:52:02 crc 
kubenswrapper[4764]: I0309 13:52:02.532207 4764 generic.go:334] "Generic (PLEG): container finished" podID="f422975f-b0ee-4ef9-be32-3aac0003a54d" containerID="c5badba421cb7b57d824a0a46932addc59fbe992b12c15d75cb49304a00e2761" exitCode=0 Mar 09 13:52:02 crc kubenswrapper[4764]: I0309 13:52:02.532334 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551072-2q8rg" event={"ID":"f422975f-b0ee-4ef9-be32-3aac0003a54d","Type":"ContainerDied","Data":"c5badba421cb7b57d824a0a46932addc59fbe992b12c15d75cb49304a00e2761"} Mar 09 13:52:03 crc kubenswrapper[4764]: I0309 13:52:03.937348 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551072-2q8rg" Mar 09 13:52:04 crc kubenswrapper[4764]: I0309 13:52:04.004687 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czbpz\" (UniqueName: \"kubernetes.io/projected/f422975f-b0ee-4ef9-be32-3aac0003a54d-kube-api-access-czbpz\") pod \"f422975f-b0ee-4ef9-be32-3aac0003a54d\" (UID: \"f422975f-b0ee-4ef9-be32-3aac0003a54d\") " Mar 09 13:52:04 crc kubenswrapper[4764]: I0309 13:52:04.015387 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f422975f-b0ee-4ef9-be32-3aac0003a54d-kube-api-access-czbpz" (OuterVolumeSpecName: "kube-api-access-czbpz") pod "f422975f-b0ee-4ef9-be32-3aac0003a54d" (UID: "f422975f-b0ee-4ef9-be32-3aac0003a54d"). InnerVolumeSpecName "kube-api-access-czbpz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:52:04 crc kubenswrapper[4764]: I0309 13:52:04.107018 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czbpz\" (UniqueName: \"kubernetes.io/projected/f422975f-b0ee-4ef9-be32-3aac0003a54d-kube-api-access-czbpz\") on node \"crc\" DevicePath \"\"" Mar 09 13:52:04 crc kubenswrapper[4764]: I0309 13:52:04.557592 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551072-2q8rg" event={"ID":"f422975f-b0ee-4ef9-be32-3aac0003a54d","Type":"ContainerDied","Data":"0dbd38e51658ebae4316351941269fda1f723a2cb9d62a11c545d824da270a51"} Mar 09 13:52:04 crc kubenswrapper[4764]: I0309 13:52:04.558218 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dbd38e51658ebae4316351941269fda1f723a2cb9d62a11c545d824da270a51" Mar 09 13:52:04 crc kubenswrapper[4764]: I0309 13:52:04.557678 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551072-2q8rg" Mar 09 13:52:05 crc kubenswrapper[4764]: I0309 13:52:05.021545 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551066-q2m88"] Mar 09 13:52:05 crc kubenswrapper[4764]: I0309 13:52:05.030972 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551066-q2m88"] Mar 09 13:52:05 crc kubenswrapper[4764]: I0309 13:52:05.589034 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f277802-4cc0-41e2-90f9-a9e2ac441979" path="/var/lib/kubelet/pods/2f277802-4cc0-41e2-90f9-a9e2ac441979/volumes" Mar 09 13:52:06 crc kubenswrapper[4764]: I0309 13:52:06.617552 4764 generic.go:334] "Generic (PLEG): container finished" podID="c0456561-9f16-4a32-b3ec-6ab6aa808b76" containerID="860e53941ce3fa2c04ed10b54cb5f6fa59f9d4631186670908ce0c966139e37f" exitCode=0 Mar 09 13:52:06 crc kubenswrapper[4764]: I0309 13:52:06.617616 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" event={"ID":"c0456561-9f16-4a32-b3ec-6ab6aa808b76","Type":"ContainerDied","Data":"860e53941ce3fa2c04ed10b54cb5f6fa59f9d4631186670908ce0c966139e37f"} Mar 09 13:52:07 crc kubenswrapper[4764]: I0309 13:52:07.025684 4764 scope.go:117] "RemoveContainer" containerID="bcb17cd274a1a85d7439c286f89b2351717de890e1de960b1f945d832ef377e9" Mar 09 13:52:07 crc kubenswrapper[4764]: I0309 13:52:07.073191 4764 scope.go:117] "RemoveContainer" containerID="ad8cc18bf0e9a68496606eb3c3aef5b4008faa06bd6343718c7f8c425fd14c07" Mar 09 13:52:07 crc kubenswrapper[4764]: I0309 13:52:07.129520 4764 scope.go:117] "RemoveContainer" containerID="e95470c676ddedadba89efafc3707fbce528908b0bfa879405e032128d81cc49" Mar 09 13:52:07 crc kubenswrapper[4764]: I0309 13:52:07.178231 4764 scope.go:117] "RemoveContainer" containerID="ec23e6a8c58e7f0bc7312e60b63b0c20314c347b59fda500de1bb63ca3c5e6c6" Mar 09 13:52:07 crc kubenswrapper[4764]: I0309 13:52:07.216774 4764 scope.go:117] "RemoveContainer" containerID="1a80c7f5de2d815f10d1b147b52b1c923bf4e3266278afd841a98b5bc66a8ad6" Mar 09 13:52:08 crc kubenswrapper[4764]: I0309 13:52:08.012848 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" Mar 09 13:52:08 crc kubenswrapper[4764]: I0309 13:52:08.099804 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65n7r\" (UniqueName: \"kubernetes.io/projected/c0456561-9f16-4a32-b3ec-6ab6aa808b76-kube-api-access-65n7r\") pod \"c0456561-9f16-4a32-b3ec-6ab6aa808b76\" (UID: \"c0456561-9f16-4a32-b3ec-6ab6aa808b76\") " Mar 09 13:52:08 crc kubenswrapper[4764]: I0309 13:52:08.099902 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0456561-9f16-4a32-b3ec-6ab6aa808b76-ssh-key-openstack-edpm-ipam\") pod \"c0456561-9f16-4a32-b3ec-6ab6aa808b76\" (UID: \"c0456561-9f16-4a32-b3ec-6ab6aa808b76\") " Mar 09 13:52:08 crc kubenswrapper[4764]: I0309 13:52:08.100001 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0456561-9f16-4a32-b3ec-6ab6aa808b76-inventory\") pod \"c0456561-9f16-4a32-b3ec-6ab6aa808b76\" (UID: \"c0456561-9f16-4a32-b3ec-6ab6aa808b76\") " Mar 09 13:52:08 crc kubenswrapper[4764]: I0309 13:52:08.107041 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0456561-9f16-4a32-b3ec-6ab6aa808b76-kube-api-access-65n7r" (OuterVolumeSpecName: "kube-api-access-65n7r") pod "c0456561-9f16-4a32-b3ec-6ab6aa808b76" (UID: "c0456561-9f16-4a32-b3ec-6ab6aa808b76"). InnerVolumeSpecName "kube-api-access-65n7r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:52:08 crc kubenswrapper[4764]: I0309 13:52:08.127509 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0456561-9f16-4a32-b3ec-6ab6aa808b76-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c0456561-9f16-4a32-b3ec-6ab6aa808b76" (UID: "c0456561-9f16-4a32-b3ec-6ab6aa808b76"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:52:08 crc kubenswrapper[4764]: I0309 13:52:08.133079 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0456561-9f16-4a32-b3ec-6ab6aa808b76-inventory" (OuterVolumeSpecName: "inventory") pod "c0456561-9f16-4a32-b3ec-6ab6aa808b76" (UID: "c0456561-9f16-4a32-b3ec-6ab6aa808b76"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:52:08 crc kubenswrapper[4764]: I0309 13:52:08.201278 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65n7r\" (UniqueName: \"kubernetes.io/projected/c0456561-9f16-4a32-b3ec-6ab6aa808b76-kube-api-access-65n7r\") on node \"crc\" DevicePath \"\"" Mar 09 13:52:08 crc kubenswrapper[4764]: I0309 13:52:08.201309 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0456561-9f16-4a32-b3ec-6ab6aa808b76-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:52:08 crc kubenswrapper[4764]: I0309 13:52:08.201323 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0456561-9f16-4a32-b3ec-6ab6aa808b76-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:52:08 crc kubenswrapper[4764]: I0309 13:52:08.643611 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" 
event={"ID":"c0456561-9f16-4a32-b3ec-6ab6aa808b76","Type":"ContainerDied","Data":"7d9162164851ca102e807aeca92eff9a13dffecebd98044e7f6059bda0fecc3a"} Mar 09 13:52:08 crc kubenswrapper[4764]: I0309 13:52:08.643689 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d9162164851ca102e807aeca92eff9a13dffecebd98044e7f6059bda0fecc3a" Mar 09 13:52:08 crc kubenswrapper[4764]: I0309 13:52:08.643770 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" Mar 09 13:52:10 crc kubenswrapper[4764]: I0309 13:52:10.034307 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-tchwl"] Mar 09 13:52:10 crc kubenswrapper[4764]: I0309 13:52:10.047883 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-tchwl"] Mar 09 13:52:11 crc kubenswrapper[4764]: I0309 13:52:11.038738 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-tbf9j"] Mar 09 13:52:11 crc kubenswrapper[4764]: I0309 13:52:11.048337 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-mnqg7"] Mar 09 13:52:11 crc kubenswrapper[4764]: I0309 13:52:11.057527 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-f7d8-account-create-update-rp748"] Mar 09 13:52:11 crc kubenswrapper[4764]: I0309 13:52:11.069818 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-mnqg7"] Mar 09 13:52:11 crc kubenswrapper[4764]: I0309 13:52:11.080367 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-tbf9j"] Mar 09 13:52:11 crc kubenswrapper[4764]: I0309 13:52:11.090348 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cc98-account-create-update-sjfqm"] Mar 09 13:52:11 crc kubenswrapper[4764]: I0309 13:52:11.098806 4764 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/nova-api-78eb-account-create-update-2dqgt"] Mar 09 13:52:11 crc kubenswrapper[4764]: I0309 13:52:11.107921 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-f7d8-account-create-update-rp748"] Mar 09 13:52:11 crc kubenswrapper[4764]: I0309 13:52:11.115776 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cc98-account-create-update-sjfqm"] Mar 09 13:52:11 crc kubenswrapper[4764]: I0309 13:52:11.127106 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-78eb-account-create-update-2dqgt"] Mar 09 13:52:11 crc kubenswrapper[4764]: I0309 13:52:11.572613 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3db8af07-1310-4cd5-be07-3fd062fe89a7" path="/var/lib/kubelet/pods/3db8af07-1310-4cd5-be07-3fd062fe89a7/volumes" Mar 09 13:52:11 crc kubenswrapper[4764]: I0309 13:52:11.573463 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c" path="/var/lib/kubelet/pods/5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c/volumes" Mar 09 13:52:11 crc kubenswrapper[4764]: I0309 13:52:11.574243 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66003ca3-e579-4dab-b714-b5b2baa26bad" path="/var/lib/kubelet/pods/66003ca3-e579-4dab-b714-b5b2baa26bad/volumes" Mar 09 13:52:11 crc kubenswrapper[4764]: I0309 13:52:11.575031 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fa35355-06e1-403f-9691-92398769ac09" path="/var/lib/kubelet/pods/8fa35355-06e1-403f-9691-92398769ac09/volumes" Mar 09 13:52:11 crc kubenswrapper[4764]: I0309 13:52:11.576183 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a75ea85a-1e66-4e8d-92d7-6f9b766abfda" path="/var/lib/kubelet/pods/a75ea85a-1e66-4e8d-92d7-6f9b766abfda/volumes" Mar 09 13:52:11 crc kubenswrapper[4764]: I0309 13:52:11.576807 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="b5daba6a-a01a-4400-aa87-01f9efd3abd8" path="/var/lib/kubelet/pods/b5daba6a-a01a-4400-aa87-01f9efd3abd8/volumes" Mar 09 13:52:12 crc kubenswrapper[4764]: I0309 13:52:12.560167 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:52:12 crc kubenswrapper[4764]: E0309 13:52:12.560535 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:52:23 crc kubenswrapper[4764]: I0309 13:52:23.559617 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:52:23 crc kubenswrapper[4764]: E0309 13:52:23.560826 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:52:38 crc kubenswrapper[4764]: I0309 13:52:38.560213 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:52:38 crc kubenswrapper[4764]: I0309 13:52:38.957672 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"c4a2ad399af00232f256b54f6dba8cd11d56a98b4177187d6164b2b8cca69e65"} Mar 
09 13:52:39 crc kubenswrapper[4764]: I0309 13:52:39.059501 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kkcml"] Mar 09 13:52:39 crc kubenswrapper[4764]: I0309 13:52:39.071690 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kkcml"] Mar 09 13:52:39 crc kubenswrapper[4764]: I0309 13:52:39.572341 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0a40476-ff1d-443d-846f-a54cd956aaa3" path="/var/lib/kubelet/pods/c0a40476-ff1d-443d-846f-a54cd956aaa3/volumes" Mar 09 13:52:54 crc kubenswrapper[4764]: I0309 13:52:54.755234 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7zckj"] Mar 09 13:52:54 crc kubenswrapper[4764]: E0309 13:52:54.756566 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0456561-9f16-4a32-b3ec-6ab6aa808b76" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:52:54 crc kubenswrapper[4764]: I0309 13:52:54.756589 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0456561-9f16-4a32-b3ec-6ab6aa808b76" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:52:54 crc kubenswrapper[4764]: E0309 13:52:54.756611 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f422975f-b0ee-4ef9-be32-3aac0003a54d" containerName="oc" Mar 09 13:52:54 crc kubenswrapper[4764]: I0309 13:52:54.756619 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f422975f-b0ee-4ef9-be32-3aac0003a54d" containerName="oc" Mar 09 13:52:54 crc kubenswrapper[4764]: I0309 13:52:54.756850 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f422975f-b0ee-4ef9-be32-3aac0003a54d" containerName="oc" Mar 09 13:52:54 crc kubenswrapper[4764]: I0309 13:52:54.756862 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0456561-9f16-4a32-b3ec-6ab6aa808b76" 
containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:52:54 crc kubenswrapper[4764]: I0309 13:52:54.758487 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:52:54 crc kubenswrapper[4764]: I0309 13:52:54.768920 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7zckj"] Mar 09 13:52:54 crc kubenswrapper[4764]: I0309 13:52:54.870421 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53cd597b-146b-436f-9f54-1fa50726458b-utilities\") pod \"certified-operators-7zckj\" (UID: \"53cd597b-146b-436f-9f54-1fa50726458b\") " pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:52:54 crc kubenswrapper[4764]: I0309 13:52:54.870520 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53cd597b-146b-436f-9f54-1fa50726458b-catalog-content\") pod \"certified-operators-7zckj\" (UID: \"53cd597b-146b-436f-9f54-1fa50726458b\") " pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:52:54 crc kubenswrapper[4764]: I0309 13:52:54.871033 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtj56\" (UniqueName: \"kubernetes.io/projected/53cd597b-146b-436f-9f54-1fa50726458b-kube-api-access-wtj56\") pod \"certified-operators-7zckj\" (UID: \"53cd597b-146b-436f-9f54-1fa50726458b\") " pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:52:54 crc kubenswrapper[4764]: I0309 13:52:54.973788 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53cd597b-146b-436f-9f54-1fa50726458b-utilities\") pod \"certified-operators-7zckj\" (UID: 
\"53cd597b-146b-436f-9f54-1fa50726458b\") " pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:52:54 crc kubenswrapper[4764]: I0309 13:52:54.973933 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53cd597b-146b-436f-9f54-1fa50726458b-catalog-content\") pod \"certified-operators-7zckj\" (UID: \"53cd597b-146b-436f-9f54-1fa50726458b\") " pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:52:54 crc kubenswrapper[4764]: I0309 13:52:54.974023 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtj56\" (UniqueName: \"kubernetes.io/projected/53cd597b-146b-436f-9f54-1fa50726458b-kube-api-access-wtj56\") pod \"certified-operators-7zckj\" (UID: \"53cd597b-146b-436f-9f54-1fa50726458b\") " pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:52:54 crc kubenswrapper[4764]: I0309 13:52:54.974431 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53cd597b-146b-436f-9f54-1fa50726458b-utilities\") pod \"certified-operators-7zckj\" (UID: \"53cd597b-146b-436f-9f54-1fa50726458b\") " pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:52:54 crc kubenswrapper[4764]: I0309 13:52:54.974610 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53cd597b-146b-436f-9f54-1fa50726458b-catalog-content\") pod \"certified-operators-7zckj\" (UID: \"53cd597b-146b-436f-9f54-1fa50726458b\") " pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:52:55 crc kubenswrapper[4764]: I0309 13:52:54.998432 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtj56\" (UniqueName: \"kubernetes.io/projected/53cd597b-146b-436f-9f54-1fa50726458b-kube-api-access-wtj56\") pod \"certified-operators-7zckj\" (UID: 
\"53cd597b-146b-436f-9f54-1fa50726458b\") " pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:52:55 crc kubenswrapper[4764]: I0309 13:52:55.084151 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:52:55 crc kubenswrapper[4764]: I0309 13:52:55.646574 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7zckj"] Mar 09 13:52:56 crc kubenswrapper[4764]: I0309 13:52:56.129178 4764 generic.go:334] "Generic (PLEG): container finished" podID="53cd597b-146b-436f-9f54-1fa50726458b" containerID="0adb8a88fa46fd13d25ba3d3e14b2b447ec8398ba9dd882cdf16f9bcdadba4c0" exitCode=0 Mar 09 13:52:56 crc kubenswrapper[4764]: I0309 13:52:56.129316 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zckj" event={"ID":"53cd597b-146b-436f-9f54-1fa50726458b","Type":"ContainerDied","Data":"0adb8a88fa46fd13d25ba3d3e14b2b447ec8398ba9dd882cdf16f9bcdadba4c0"} Mar 09 13:52:56 crc kubenswrapper[4764]: I0309 13:52:56.130362 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zckj" event={"ID":"53cd597b-146b-436f-9f54-1fa50726458b","Type":"ContainerStarted","Data":"f91c88d7aa8ba5dc934caea9344568d20f4e3ca0531d94b4dcabea782162e573"} Mar 09 13:52:57 crc kubenswrapper[4764]: I0309 13:52:57.141032 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zckj" event={"ID":"53cd597b-146b-436f-9f54-1fa50726458b","Type":"ContainerStarted","Data":"52679f972d1aa1ea08ab209079a34253c07c25a5a7b269c4910c13b1a6d5748c"} Mar 09 13:52:58 crc kubenswrapper[4764]: I0309 13:52:58.153411 4764 generic.go:334] "Generic (PLEG): container finished" podID="53cd597b-146b-436f-9f54-1fa50726458b" containerID="52679f972d1aa1ea08ab209079a34253c07c25a5a7b269c4910c13b1a6d5748c" exitCode=0 Mar 09 13:52:58 crc kubenswrapper[4764]: I0309 
13:52:58.153479 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zckj" event={"ID":"53cd597b-146b-436f-9f54-1fa50726458b","Type":"ContainerDied","Data":"52679f972d1aa1ea08ab209079a34253c07c25a5a7b269c4910c13b1a6d5748c"} Mar 09 13:52:59 crc kubenswrapper[4764]: I0309 13:52:59.164534 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zckj" event={"ID":"53cd597b-146b-436f-9f54-1fa50726458b","Type":"ContainerStarted","Data":"919bab3d178ecf7ae1c880f3441a6903919b868385a924c647dfdacd21b81aa9"} Mar 09 13:52:59 crc kubenswrapper[4764]: I0309 13:52:59.230396 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7zckj" podStartSLOduration=2.755607779 podStartE2EDuration="5.230368892s" podCreationTimestamp="2026-03-09 13:52:54 +0000 UTC" firstStartedPulling="2026-03-09 13:52:56.131165571 +0000 UTC m=+1931.381337469" lastFinishedPulling="2026-03-09 13:52:58.605926674 +0000 UTC m=+1933.856098582" observedRunningTime="2026-03-09 13:52:59.214013536 +0000 UTC m=+1934.464185464" watchObservedRunningTime="2026-03-09 13:52:59.230368892 +0000 UTC m=+1934.480540820" Mar 09 13:53:03 crc kubenswrapper[4764]: I0309 13:53:03.049141 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-5k46p"] Mar 09 13:53:03 crc kubenswrapper[4764]: I0309 13:53:03.063776 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nlxjx"] Mar 09 13:53:03 crc kubenswrapper[4764]: I0309 13:53:03.077517 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-5k46p"] Mar 09 13:53:03 crc kubenswrapper[4764]: I0309 13:53:03.091119 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nlxjx"] Mar 09 13:53:03 crc kubenswrapper[4764]: I0309 13:53:03.575193 4764 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="9f09c604-028e-4965-aef8-6005ae365be9" path="/var/lib/kubelet/pods/9f09c604-028e-4965-aef8-6005ae365be9/volumes" Mar 09 13:53:03 crc kubenswrapper[4764]: I0309 13:53:03.576181 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b60c99da-3ae5-4340-bcb0-870731679c16" path="/var/lib/kubelet/pods/b60c99da-3ae5-4340-bcb0-870731679c16/volumes" Mar 09 13:53:05 crc kubenswrapper[4764]: I0309 13:53:05.085310 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:53:05 crc kubenswrapper[4764]: I0309 13:53:05.085548 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:53:05 crc kubenswrapper[4764]: I0309 13:53:05.136118 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:53:05 crc kubenswrapper[4764]: I0309 13:53:05.291237 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:53:07 crc kubenswrapper[4764]: I0309 13:53:07.414815 4764 scope.go:117] "RemoveContainer" containerID="83d6da43ad98ba05734f9db9d33ae5993158ca7ef62b6b91c4677e94546cef00" Mar 09 13:53:07 crc kubenswrapper[4764]: I0309 13:53:07.457033 4764 scope.go:117] "RemoveContainer" containerID="68e1ea9724cff446a48722f5d3beb35f95107e4f92a2482a35467b3ee65b6d69" Mar 09 13:53:07 crc kubenswrapper[4764]: I0309 13:53:07.479717 4764 scope.go:117] "RemoveContainer" containerID="d7855c57065ea118e0a7a66a2f52d85558ba845a318ccc1f11c06fba1afae771" Mar 09 13:53:07 crc kubenswrapper[4764]: I0309 13:53:07.542334 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7zckj"] Mar 09 13:53:07 crc kubenswrapper[4764]: I0309 13:53:07.554913 4764 scope.go:117] "RemoveContainer" 
containerID="2c7f017bce7c92c14d6be609c4899f3333d712b1c6dc036ede4a982ad2477f70" Mar 09 13:53:07 crc kubenswrapper[4764]: I0309 13:53:07.580370 4764 scope.go:117] "RemoveContainer" containerID="ea1b24536350a8debd8d3a73db534419bac8b767469b0a62084cf6812d4fbd33" Mar 09 13:53:07 crc kubenswrapper[4764]: I0309 13:53:07.627539 4764 scope.go:117] "RemoveContainer" containerID="17c99031bb7ad39e3a6bf2e953eeaec9098edceedb7c2ac4d40bb3b9857101a1" Mar 09 13:53:07 crc kubenswrapper[4764]: I0309 13:53:07.670132 4764 scope.go:117] "RemoveContainer" containerID="5ac07a8dfc2c0d6d538f5fde06f5c7806d78d08f4964c253a85bccef288caf3e" Mar 09 13:53:07 crc kubenswrapper[4764]: I0309 13:53:07.697682 4764 scope.go:117] "RemoveContainer" containerID="01f8a44cba8ccff51deb3a73f62a17d6b208f1f8fb17e65f80892f3b0a90b431" Mar 09 13:53:07 crc kubenswrapper[4764]: I0309 13:53:07.738926 4764 scope.go:117] "RemoveContainer" containerID="d053ca844a6ffecdf233cee0796c2188368f4f6b2a5097474b6a14b438549f4b" Mar 09 13:53:08 crc kubenswrapper[4764]: I0309 13:53:08.268693 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7zckj" podUID="53cd597b-146b-436f-9f54-1fa50726458b" containerName="registry-server" containerID="cri-o://919bab3d178ecf7ae1c880f3441a6903919b868385a924c647dfdacd21b81aa9" gracePeriod=2 Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.233858 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.294338 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtj56\" (UniqueName: \"kubernetes.io/projected/53cd597b-146b-436f-9f54-1fa50726458b-kube-api-access-wtj56\") pod \"53cd597b-146b-436f-9f54-1fa50726458b\" (UID: \"53cd597b-146b-436f-9f54-1fa50726458b\") " Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.294570 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53cd597b-146b-436f-9f54-1fa50726458b-catalog-content\") pod \"53cd597b-146b-436f-9f54-1fa50726458b\" (UID: \"53cd597b-146b-436f-9f54-1fa50726458b\") " Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.294612 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53cd597b-146b-436f-9f54-1fa50726458b-utilities\") pod \"53cd597b-146b-436f-9f54-1fa50726458b\" (UID: \"53cd597b-146b-436f-9f54-1fa50726458b\") " Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.296009 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53cd597b-146b-436f-9f54-1fa50726458b-utilities" (OuterVolumeSpecName: "utilities") pod "53cd597b-146b-436f-9f54-1fa50726458b" (UID: "53cd597b-146b-436f-9f54-1fa50726458b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.299008 4764 generic.go:334] "Generic (PLEG): container finished" podID="53cd597b-146b-436f-9f54-1fa50726458b" containerID="919bab3d178ecf7ae1c880f3441a6903919b868385a924c647dfdacd21b81aa9" exitCode=0 Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.299128 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.299128 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zckj" event={"ID":"53cd597b-146b-436f-9f54-1fa50726458b","Type":"ContainerDied","Data":"919bab3d178ecf7ae1c880f3441a6903919b868385a924c647dfdacd21b81aa9"} Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.299615 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zckj" event={"ID":"53cd597b-146b-436f-9f54-1fa50726458b","Type":"ContainerDied","Data":"f91c88d7aa8ba5dc934caea9344568d20f4e3ca0531d94b4dcabea782162e573"} Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.299683 4764 scope.go:117] "RemoveContainer" containerID="919bab3d178ecf7ae1c880f3441a6903919b868385a924c647dfdacd21b81aa9" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.304043 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53cd597b-146b-436f-9f54-1fa50726458b-kube-api-access-wtj56" (OuterVolumeSpecName: "kube-api-access-wtj56") pod "53cd597b-146b-436f-9f54-1fa50726458b" (UID: "53cd597b-146b-436f-9f54-1fa50726458b"). InnerVolumeSpecName "kube-api-access-wtj56". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.364625 4764 scope.go:117] "RemoveContainer" containerID="52679f972d1aa1ea08ab209079a34253c07c25a5a7b269c4910c13b1a6d5748c" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.381883 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53cd597b-146b-436f-9f54-1fa50726458b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "53cd597b-146b-436f-9f54-1fa50726458b" (UID: "53cd597b-146b-436f-9f54-1fa50726458b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.388006 4764 scope.go:117] "RemoveContainer" containerID="0adb8a88fa46fd13d25ba3d3e14b2b447ec8398ba9dd882cdf16f9bcdadba4c0" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.398053 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53cd597b-146b-436f-9f54-1fa50726458b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.398086 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53cd597b-146b-436f-9f54-1fa50726458b-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.398099 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtj56\" (UniqueName: \"kubernetes.io/projected/53cd597b-146b-436f-9f54-1fa50726458b-kube-api-access-wtj56\") on node \"crc\" DevicePath \"\"" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.431407 4764 scope.go:117] "RemoveContainer" containerID="919bab3d178ecf7ae1c880f3441a6903919b868385a924c647dfdacd21b81aa9" Mar 09 13:53:09 crc kubenswrapper[4764]: E0309 13:53:09.436313 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"919bab3d178ecf7ae1c880f3441a6903919b868385a924c647dfdacd21b81aa9\": container with ID starting with 919bab3d178ecf7ae1c880f3441a6903919b868385a924c647dfdacd21b81aa9 not found: ID does not exist" containerID="919bab3d178ecf7ae1c880f3441a6903919b868385a924c647dfdacd21b81aa9" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.436369 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"919bab3d178ecf7ae1c880f3441a6903919b868385a924c647dfdacd21b81aa9"} err="failed to get container status 
\"919bab3d178ecf7ae1c880f3441a6903919b868385a924c647dfdacd21b81aa9\": rpc error: code = NotFound desc = could not find container \"919bab3d178ecf7ae1c880f3441a6903919b868385a924c647dfdacd21b81aa9\": container with ID starting with 919bab3d178ecf7ae1c880f3441a6903919b868385a924c647dfdacd21b81aa9 not found: ID does not exist" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.436401 4764 scope.go:117] "RemoveContainer" containerID="52679f972d1aa1ea08ab209079a34253c07c25a5a7b269c4910c13b1a6d5748c" Mar 09 13:53:09 crc kubenswrapper[4764]: E0309 13:53:09.437134 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52679f972d1aa1ea08ab209079a34253c07c25a5a7b269c4910c13b1a6d5748c\": container with ID starting with 52679f972d1aa1ea08ab209079a34253c07c25a5a7b269c4910c13b1a6d5748c not found: ID does not exist" containerID="52679f972d1aa1ea08ab209079a34253c07c25a5a7b269c4910c13b1a6d5748c" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.437160 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52679f972d1aa1ea08ab209079a34253c07c25a5a7b269c4910c13b1a6d5748c"} err="failed to get container status \"52679f972d1aa1ea08ab209079a34253c07c25a5a7b269c4910c13b1a6d5748c\": rpc error: code = NotFound desc = could not find container \"52679f972d1aa1ea08ab209079a34253c07c25a5a7b269c4910c13b1a6d5748c\": container with ID starting with 52679f972d1aa1ea08ab209079a34253c07c25a5a7b269c4910c13b1a6d5748c not found: ID does not exist" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.437176 4764 scope.go:117] "RemoveContainer" containerID="0adb8a88fa46fd13d25ba3d3e14b2b447ec8398ba9dd882cdf16f9bcdadba4c0" Mar 09 13:53:09 crc kubenswrapper[4764]: E0309 13:53:09.438869 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0adb8a88fa46fd13d25ba3d3e14b2b447ec8398ba9dd882cdf16f9bcdadba4c0\": container with ID starting with 0adb8a88fa46fd13d25ba3d3e14b2b447ec8398ba9dd882cdf16f9bcdadba4c0 not found: ID does not exist" containerID="0adb8a88fa46fd13d25ba3d3e14b2b447ec8398ba9dd882cdf16f9bcdadba4c0" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.438905 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0adb8a88fa46fd13d25ba3d3e14b2b447ec8398ba9dd882cdf16f9bcdadba4c0"} err="failed to get container status \"0adb8a88fa46fd13d25ba3d3e14b2b447ec8398ba9dd882cdf16f9bcdadba4c0\": rpc error: code = NotFound desc = could not find container \"0adb8a88fa46fd13d25ba3d3e14b2b447ec8398ba9dd882cdf16f9bcdadba4c0\": container with ID starting with 0adb8a88fa46fd13d25ba3d3e14b2b447ec8398ba9dd882cdf16f9bcdadba4c0 not found: ID does not exist" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.637093 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7zckj"] Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.651899 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7zckj"] Mar 09 13:53:11 crc kubenswrapper[4764]: I0309 13:53:11.585911 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53cd597b-146b-436f-9f54-1fa50726458b" path="/var/lib/kubelet/pods/53cd597b-146b-436f-9f54-1fa50726458b/volumes" Mar 09 13:53:48 crc kubenswrapper[4764]: I0309 13:53:48.052412 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-q8h4m"] Mar 09 13:53:48 crc kubenswrapper[4764]: I0309 13:53:48.092787 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-q8h4m"] Mar 09 13:53:49 crc kubenswrapper[4764]: I0309 13:53:49.574225 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d98526d5-8eaa-44a7-a25d-662a4fc8758b" 
path="/var/lib/kubelet/pods/d98526d5-8eaa-44a7-a25d-662a4fc8758b/volumes" Mar 09 13:54:00 crc kubenswrapper[4764]: I0309 13:54:00.163302 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551074-ztbcz"] Mar 09 13:54:00 crc kubenswrapper[4764]: E0309 13:54:00.165921 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53cd597b-146b-436f-9f54-1fa50726458b" containerName="registry-server" Mar 09 13:54:00 crc kubenswrapper[4764]: I0309 13:54:00.166044 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="53cd597b-146b-436f-9f54-1fa50726458b" containerName="registry-server" Mar 09 13:54:00 crc kubenswrapper[4764]: E0309 13:54:00.166147 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53cd597b-146b-436f-9f54-1fa50726458b" containerName="extract-utilities" Mar 09 13:54:00 crc kubenswrapper[4764]: I0309 13:54:00.166224 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="53cd597b-146b-436f-9f54-1fa50726458b" containerName="extract-utilities" Mar 09 13:54:00 crc kubenswrapper[4764]: E0309 13:54:00.166334 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53cd597b-146b-436f-9f54-1fa50726458b" containerName="extract-content" Mar 09 13:54:00 crc kubenswrapper[4764]: I0309 13:54:00.166412 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="53cd597b-146b-436f-9f54-1fa50726458b" containerName="extract-content" Mar 09 13:54:00 crc kubenswrapper[4764]: I0309 13:54:00.166791 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="53cd597b-146b-436f-9f54-1fa50726458b" containerName="registry-server" Mar 09 13:54:00 crc kubenswrapper[4764]: I0309 13:54:00.168026 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551074-ztbcz" Mar 09 13:54:00 crc kubenswrapper[4764]: I0309 13:54:00.171517 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 13:54:00 crc kubenswrapper[4764]: I0309 13:54:00.172156 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:54:00 crc kubenswrapper[4764]: I0309 13:54:00.172435 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:54:00 crc kubenswrapper[4764]: I0309 13:54:00.182786 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551074-ztbcz"] Mar 09 13:54:00 crc kubenswrapper[4764]: I0309 13:54:00.262418 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj6fm\" (UniqueName: \"kubernetes.io/projected/909c58d5-d4d7-4042-94f0-df77bda9590a-kube-api-access-vj6fm\") pod \"auto-csr-approver-29551074-ztbcz\" (UID: \"909c58d5-d4d7-4042-94f0-df77bda9590a\") " pod="openshift-infra/auto-csr-approver-29551074-ztbcz" Mar 09 13:54:00 crc kubenswrapper[4764]: I0309 13:54:00.365536 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj6fm\" (UniqueName: \"kubernetes.io/projected/909c58d5-d4d7-4042-94f0-df77bda9590a-kube-api-access-vj6fm\") pod \"auto-csr-approver-29551074-ztbcz\" (UID: \"909c58d5-d4d7-4042-94f0-df77bda9590a\") " pod="openshift-infra/auto-csr-approver-29551074-ztbcz" Mar 09 13:54:00 crc kubenswrapper[4764]: I0309 13:54:00.397198 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj6fm\" (UniqueName: \"kubernetes.io/projected/909c58d5-d4d7-4042-94f0-df77bda9590a-kube-api-access-vj6fm\") pod \"auto-csr-approver-29551074-ztbcz\" (UID: \"909c58d5-d4d7-4042-94f0-df77bda9590a\") " 
pod="openshift-infra/auto-csr-approver-29551074-ztbcz" Mar 09 13:54:00 crc kubenswrapper[4764]: I0309 13:54:00.507955 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551074-ztbcz" Mar 09 13:54:01 crc kubenswrapper[4764]: I0309 13:54:01.015092 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551074-ztbcz"] Mar 09 13:54:01 crc kubenswrapper[4764]: W0309 13:54:01.024422 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod909c58d5_d4d7_4042_94f0_df77bda9590a.slice/crio-faf0a4667a66e59562a883a2c9fef864e6e3bfde2c7c8428277f182730288329 WatchSource:0}: Error finding container faf0a4667a66e59562a883a2c9fef864e6e3bfde2c7c8428277f182730288329: Status 404 returned error can't find the container with id faf0a4667a66e59562a883a2c9fef864e6e3bfde2c7c8428277f182730288329 Mar 09 13:54:01 crc kubenswrapper[4764]: I0309 13:54:01.837564 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551074-ztbcz" event={"ID":"909c58d5-d4d7-4042-94f0-df77bda9590a","Type":"ContainerStarted","Data":"faf0a4667a66e59562a883a2c9fef864e6e3bfde2c7c8428277f182730288329"} Mar 09 13:54:02 crc kubenswrapper[4764]: I0309 13:54:02.851936 4764 generic.go:334] "Generic (PLEG): container finished" podID="909c58d5-d4d7-4042-94f0-df77bda9590a" containerID="5510a27aa618536b31f12ced254c914aa21f71e4d8962e547b119bb29d1548f1" exitCode=0 Mar 09 13:54:02 crc kubenswrapper[4764]: I0309 13:54:02.852067 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551074-ztbcz" event={"ID":"909c58d5-d4d7-4042-94f0-df77bda9590a","Type":"ContainerDied","Data":"5510a27aa618536b31f12ced254c914aa21f71e4d8962e547b119bb29d1548f1"} Mar 09 13:54:04 crc kubenswrapper[4764]: I0309 13:54:04.182261 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551074-ztbcz" Mar 09 13:54:04 crc kubenswrapper[4764]: I0309 13:54:04.251740 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vj6fm\" (UniqueName: \"kubernetes.io/projected/909c58d5-d4d7-4042-94f0-df77bda9590a-kube-api-access-vj6fm\") pod \"909c58d5-d4d7-4042-94f0-df77bda9590a\" (UID: \"909c58d5-d4d7-4042-94f0-df77bda9590a\") " Mar 09 13:54:04 crc kubenswrapper[4764]: I0309 13:54:04.258948 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/909c58d5-d4d7-4042-94f0-df77bda9590a-kube-api-access-vj6fm" (OuterVolumeSpecName: "kube-api-access-vj6fm") pod "909c58d5-d4d7-4042-94f0-df77bda9590a" (UID: "909c58d5-d4d7-4042-94f0-df77bda9590a"). InnerVolumeSpecName "kube-api-access-vj6fm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:54:04 crc kubenswrapper[4764]: I0309 13:54:04.355607 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vj6fm\" (UniqueName: \"kubernetes.io/projected/909c58d5-d4d7-4042-94f0-df77bda9590a-kube-api-access-vj6fm\") on node \"crc\" DevicePath \"\"" Mar 09 13:54:04 crc kubenswrapper[4764]: I0309 13:54:04.873348 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551074-ztbcz" event={"ID":"909c58d5-d4d7-4042-94f0-df77bda9590a","Type":"ContainerDied","Data":"faf0a4667a66e59562a883a2c9fef864e6e3bfde2c7c8428277f182730288329"} Mar 09 13:54:04 crc kubenswrapper[4764]: I0309 13:54:04.873410 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="faf0a4667a66e59562a883a2c9fef864e6e3bfde2c7c8428277f182730288329" Mar 09 13:54:04 crc kubenswrapper[4764]: I0309 13:54:04.873464 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551074-ztbcz" Mar 09 13:54:05 crc kubenswrapper[4764]: I0309 13:54:05.261364 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551068-v6md5"] Mar 09 13:54:05 crc kubenswrapper[4764]: I0309 13:54:05.270864 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551068-v6md5"] Mar 09 13:54:05 crc kubenswrapper[4764]: I0309 13:54:05.575625 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab11b944-7857-4998-b32b-264ac7683616" path="/var/lib/kubelet/pods/ab11b944-7857-4998-b32b-264ac7683616/volumes" Mar 09 13:54:07 crc kubenswrapper[4764]: I0309 13:54:07.931189 4764 scope.go:117] "RemoveContainer" containerID="2a7ab8e616981504d9d31e4fc4313f083401f97f7e60e7b3cdc2825dfc09335b" Mar 09 13:54:07 crc kubenswrapper[4764]: I0309 13:54:07.993075 4764 scope.go:117] "RemoveContainer" containerID="858a961159a6d1cec6dd22f3429517714ebd1b6a16fd98b28235cb60fc1a94ee" Mar 09 13:54:58 crc kubenswrapper[4764]: I0309 13:54:58.370371 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:54:58 crc kubenswrapper[4764]: I0309 13:54:58.371077 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:55:28 crc kubenswrapper[4764]: I0309 13:55:28.370258 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:55:28 crc kubenswrapper[4764]: I0309 13:55:28.371051 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:55:58 crc kubenswrapper[4764]: I0309 13:55:58.371031 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:55:58 crc kubenswrapper[4764]: I0309 13:55:58.371967 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:55:58 crc kubenswrapper[4764]: I0309 13:55:58.372059 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:55:58 crc kubenswrapper[4764]: I0309 13:55:58.373422 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c4a2ad399af00232f256b54f6dba8cd11d56a98b4177187d6164b2b8cca69e65"} pod="openshift-machine-config-operator/machine-config-daemon-xxczl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 13:55:58 crc kubenswrapper[4764]: I0309 13:55:58.373498 4764 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" containerID="cri-o://c4a2ad399af00232f256b54f6dba8cd11d56a98b4177187d6164b2b8cca69e65" gracePeriod=600 Mar 09 13:55:59 crc kubenswrapper[4764]: I0309 13:55:59.009927 4764 generic.go:334] "Generic (PLEG): container finished" podID="6bcdd179-43c2-427c-9fac-7155c122e922" containerID="c4a2ad399af00232f256b54f6dba8cd11d56a98b4177187d6164b2b8cca69e65" exitCode=0 Mar 09 13:55:59 crc kubenswrapper[4764]: I0309 13:55:59.010000 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerDied","Data":"c4a2ad399af00232f256b54f6dba8cd11d56a98b4177187d6164b2b8cca69e65"} Mar 09 13:55:59 crc kubenswrapper[4764]: I0309 13:55:59.010755 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211"} Mar 09 13:55:59 crc kubenswrapper[4764]: I0309 13:55:59.010796 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:56:00 crc kubenswrapper[4764]: I0309 13:56:00.161191 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551076-qrqw5"] Mar 09 13:56:00 crc kubenswrapper[4764]: E0309 13:56:00.162352 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="909c58d5-d4d7-4042-94f0-df77bda9590a" containerName="oc" Mar 09 13:56:00 crc kubenswrapper[4764]: I0309 13:56:00.162373 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="909c58d5-d4d7-4042-94f0-df77bda9590a" containerName="oc" Mar 09 13:56:00 crc 
kubenswrapper[4764]: I0309 13:56:00.162614 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="909c58d5-d4d7-4042-94f0-df77bda9590a" containerName="oc" Mar 09 13:56:00 crc kubenswrapper[4764]: I0309 13:56:00.163705 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551076-qrqw5" Mar 09 13:56:00 crc kubenswrapper[4764]: I0309 13:56:00.166808 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:56:00 crc kubenswrapper[4764]: I0309 13:56:00.167483 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 13:56:00 crc kubenswrapper[4764]: I0309 13:56:00.167625 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:56:00 crc kubenswrapper[4764]: I0309 13:56:00.169939 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551076-qrqw5"] Mar 09 13:56:00 crc kubenswrapper[4764]: I0309 13:56:00.258276 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8v9c\" (UniqueName: \"kubernetes.io/projected/7a9d864e-dad7-4c7d-a639-d4042bb3339d-kube-api-access-v8v9c\") pod \"auto-csr-approver-29551076-qrqw5\" (UID: \"7a9d864e-dad7-4c7d-a639-d4042bb3339d\") " pod="openshift-infra/auto-csr-approver-29551076-qrqw5" Mar 09 13:56:00 crc kubenswrapper[4764]: I0309 13:56:00.361843 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8v9c\" (UniqueName: \"kubernetes.io/projected/7a9d864e-dad7-4c7d-a639-d4042bb3339d-kube-api-access-v8v9c\") pod \"auto-csr-approver-29551076-qrqw5\" (UID: \"7a9d864e-dad7-4c7d-a639-d4042bb3339d\") " pod="openshift-infra/auto-csr-approver-29551076-qrqw5" Mar 09 13:56:00 crc kubenswrapper[4764]: I0309 13:56:00.389530 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8v9c\" (UniqueName: \"kubernetes.io/projected/7a9d864e-dad7-4c7d-a639-d4042bb3339d-kube-api-access-v8v9c\") pod \"auto-csr-approver-29551076-qrqw5\" (UID: \"7a9d864e-dad7-4c7d-a639-d4042bb3339d\") " pod="openshift-infra/auto-csr-approver-29551076-qrqw5" Mar 09 13:56:00 crc kubenswrapper[4764]: I0309 13:56:00.489271 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551076-qrqw5" Mar 09 13:56:00 crc kubenswrapper[4764]: I0309 13:56:00.960259 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551076-qrqw5"] Mar 09 13:56:00 crc kubenswrapper[4764]: I0309 13:56:00.976971 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.034316 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.043763 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.046043 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551076-qrqw5" event={"ID":"7a9d864e-dad7-4c7d-a639-d4042bb3339d","Type":"ContainerStarted","Data":"71b0efab7d289e8510ef41b131942c784f08b8a9d5fbe745d921f9dc131aa089"} Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.058706 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.066906 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-ljpp5"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 
13:56:01.095106 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.105385 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.115332 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.123333 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.129993 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.137164 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.143933 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.152471 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.161095 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-ljpp5"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.168500 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.175698 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.182276 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.189582 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.200863 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.215217 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.229958 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.571882 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07f61b11-aba4-469c-a5ed-9566f1951559" path="/var/lib/kubelet/pods/07f61b11-aba4-469c-a5ed-9566f1951559/volumes" Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.572788 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dbc4eda-5f77-4951-962f-9ed0b1308df0" path="/var/lib/kubelet/pods/1dbc4eda-5f77-4951-962f-9ed0b1308df0/volumes" Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.573401 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38445f30-348d-4c11-94c5-81bca885cc36" path="/var/lib/kubelet/pods/38445f30-348d-4c11-94c5-81bca885cc36/volumes" Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.573968 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ee15cfe-dd3c-4cc7-bf8f-b324397f4add" 
path="/var/lib/kubelet/pods/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add/volumes" Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.575092 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85b3887c-ea0d-4ca0-a862-0134f0ae08b5" path="/var/lib/kubelet/pods/85b3887c-ea0d-4ca0-a862-0134f0ae08b5/volumes" Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.575634 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ec29e7b-3537-459a-bfb4-acc93e1e5ec2" path="/var/lib/kubelet/pods/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2/volumes" Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.576156 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fc5b263-ac73-4b6e-8e41-4ed508765c55" path="/var/lib/kubelet/pods/9fc5b263-ac73-4b6e-8e41-4ed508765c55/volumes" Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.577195 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0456561-9f16-4a32-b3ec-6ab6aa808b76" path="/var/lib/kubelet/pods/c0456561-9f16-4a32-b3ec-6ab6aa808b76/volumes" Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.577740 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5f7b3e4-69e8-4529-9973-63b6af6b5e5c" path="/var/lib/kubelet/pods/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c/volumes" Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.578277 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe0d9990-083b-428b-baec-a40ae99487db" path="/var/lib/kubelet/pods/fe0d9990-083b-428b-baec-a40ae99487db/volumes" Mar 09 13:56:03 crc kubenswrapper[4764]: I0309 13:56:03.070112 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551076-qrqw5" event={"ID":"7a9d864e-dad7-4c7d-a639-d4042bb3339d","Type":"ContainerStarted","Data":"8f590f50cdda09a93b8757a7d03d71c1018f7d81bfe8e8784e29856175854a29"} Mar 09 13:56:03 crc kubenswrapper[4764]: I0309 13:56:03.102890 4764 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551076-qrqw5" podStartSLOduration=1.472397691 podStartE2EDuration="3.102865283s" podCreationTimestamp="2026-03-09 13:56:00 +0000 UTC" firstStartedPulling="2026-03-09 13:56:00.976624903 +0000 UTC m=+2116.226796821" lastFinishedPulling="2026-03-09 13:56:02.607092505 +0000 UTC m=+2117.857264413" observedRunningTime="2026-03-09 13:56:03.090560625 +0000 UTC m=+2118.340732543" watchObservedRunningTime="2026-03-09 13:56:03.102865283 +0000 UTC m=+2118.353037201" Mar 09 13:56:04 crc kubenswrapper[4764]: I0309 13:56:04.082888 4764 generic.go:334] "Generic (PLEG): container finished" podID="7a9d864e-dad7-4c7d-a639-d4042bb3339d" containerID="8f590f50cdda09a93b8757a7d03d71c1018f7d81bfe8e8784e29856175854a29" exitCode=0 Mar 09 13:56:04 crc kubenswrapper[4764]: I0309 13:56:04.082959 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551076-qrqw5" event={"ID":"7a9d864e-dad7-4c7d-a639-d4042bb3339d","Type":"ContainerDied","Data":"8f590f50cdda09a93b8757a7d03d71c1018f7d81bfe8e8784e29856175854a29"} Mar 09 13:56:05 crc kubenswrapper[4764]: I0309 13:56:05.470234 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551076-qrqw5" Mar 09 13:56:05 crc kubenswrapper[4764]: I0309 13:56:05.577521 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8v9c\" (UniqueName: \"kubernetes.io/projected/7a9d864e-dad7-4c7d-a639-d4042bb3339d-kube-api-access-v8v9c\") pod \"7a9d864e-dad7-4c7d-a639-d4042bb3339d\" (UID: \"7a9d864e-dad7-4c7d-a639-d4042bb3339d\") " Mar 09 13:56:05 crc kubenswrapper[4764]: I0309 13:56:05.584745 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a9d864e-dad7-4c7d-a639-d4042bb3339d-kube-api-access-v8v9c" (OuterVolumeSpecName: "kube-api-access-v8v9c") pod "7a9d864e-dad7-4c7d-a639-d4042bb3339d" (UID: "7a9d864e-dad7-4c7d-a639-d4042bb3339d"). InnerVolumeSpecName "kube-api-access-v8v9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:56:05 crc kubenswrapper[4764]: I0309 13:56:05.680670 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8v9c\" (UniqueName: \"kubernetes.io/projected/7a9d864e-dad7-4c7d-a639-d4042bb3339d-kube-api-access-v8v9c\") on node \"crc\" DevicePath \"\"" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.104559 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551076-qrqw5" event={"ID":"7a9d864e-dad7-4c7d-a639-d4042bb3339d","Type":"ContainerDied","Data":"71b0efab7d289e8510ef41b131942c784f08b8a9d5fbe745d921f9dc131aa089"} Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.104889 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71b0efab7d289e8510ef41b131942c784f08b8a9d5fbe745d921f9dc131aa089" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.104696 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551076-qrqw5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.177611 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551070-x977g"] Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.191822 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551070-x977g"] Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.631956 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5"] Mar 09 13:56:06 crc kubenswrapper[4764]: E0309 13:56:06.634046 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a9d864e-dad7-4c7d-a639-d4042bb3339d" containerName="oc" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.634174 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a9d864e-dad7-4c7d-a639-d4042bb3339d" containerName="oc" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.634501 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a9d864e-dad7-4c7d-a639-d4042bb3339d" containerName="oc" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.635600 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.638879 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.639142 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.639477 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.639702 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.640508 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.648309 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5"] Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.701723 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.701781 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.701848 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.701901 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptkxs\" (UniqueName: \"kubernetes.io/projected/de1bf125-47e1-499c-9cfe-ffbd5c03d194-kube-api-access-ptkxs\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.702095 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.804007 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptkxs\" (UniqueName: \"kubernetes.io/projected/de1bf125-47e1-499c-9cfe-ffbd5c03d194-kube-api-access-ptkxs\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 
13:56:06.804067 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.804180 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.804238 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.804279 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.809854 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-inventory\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.810059 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.812033 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.813773 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.826478 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptkxs\" (UniqueName: \"kubernetes.io/projected/de1bf125-47e1-499c-9cfe-ffbd5c03d194-kube-api-access-ptkxs\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.958005 
4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:07 crc kubenswrapper[4764]: I0309 13:56:07.477268 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5"] Mar 09 13:56:07 crc kubenswrapper[4764]: W0309 13:56:07.490257 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde1bf125_47e1_499c_9cfe_ffbd5c03d194.slice/crio-56f8a50aaf961365025d9562e61abf237ceb3b14bb828ea24cce3d59f02d7a83 WatchSource:0}: Error finding container 56f8a50aaf961365025d9562e61abf237ceb3b14bb828ea24cce3d59f02d7a83: Status 404 returned error can't find the container with id 56f8a50aaf961365025d9562e61abf237ceb3b14bb828ea24cce3d59f02d7a83 Mar 09 13:56:07 crc kubenswrapper[4764]: I0309 13:56:07.592375 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14ef871d-e371-41df-9380-53505557d7ac" path="/var/lib/kubelet/pods/14ef871d-e371-41df-9380-53505557d7ac/volumes" Mar 09 13:56:08 crc kubenswrapper[4764]: I0309 13:56:08.122959 4764 scope.go:117] "RemoveContainer" containerID="f0aca49c15e8c97fb96eb53f9577a50ff9c6c25e52f97ab9fc13ad081c1cd506" Mar 09 13:56:08 crc kubenswrapper[4764]: I0309 13:56:08.123672 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" event={"ID":"de1bf125-47e1-499c-9cfe-ffbd5c03d194","Type":"ContainerStarted","Data":"56f8a50aaf961365025d9562e61abf237ceb3b14bb828ea24cce3d59f02d7a83"} Mar 09 13:56:08 crc kubenswrapper[4764]: I0309 13:56:08.170043 4764 scope.go:117] "RemoveContainer" containerID="8097883da308046a256f53d8042acf3d3dddd33e51cc0f93ed512385c36b57c0" Mar 09 13:56:08 crc kubenswrapper[4764]: I0309 13:56:08.240543 4764 scope.go:117] "RemoveContainer" containerID="bde45a06faa972981715117bca5bea2ffa8f521b6f7a639569a4330b9afefe5a" 
Mar 09 13:56:08 crc kubenswrapper[4764]: I0309 13:56:08.274617 4764 scope.go:117] "RemoveContainer" containerID="1c7c071cdcf2595562a14fc8211f12dd741cead95f0450c54862b35cde78ac5b" Mar 09 13:56:08 crc kubenswrapper[4764]: I0309 13:56:08.318321 4764 scope.go:117] "RemoveContainer" containerID="6e4f8f0feb8ec9ecf660634549716b0c350cba173b98c033e4c7e09aa6bd108d" Mar 09 13:56:08 crc kubenswrapper[4764]: I0309 13:56:08.396679 4764 scope.go:117] "RemoveContainer" containerID="e8df8bc509784d8d529ce5673fdcec9ae8d5b4cbcf9d86d0b27b355b5bffd393" Mar 09 13:56:09 crc kubenswrapper[4764]: I0309 13:56:09.134492 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" event={"ID":"de1bf125-47e1-499c-9cfe-ffbd5c03d194","Type":"ContainerStarted","Data":"f3660ffb8bd5fb3872ac27e05718cfd3efdfa21835cdaa2254ffd4445a2924fc"} Mar 09 13:56:09 crc kubenswrapper[4764]: I0309 13:56:09.155114 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" podStartSLOduration=2.745221798 podStartE2EDuration="3.155096334s" podCreationTimestamp="2026-03-09 13:56:06 +0000 UTC" firstStartedPulling="2026-03-09 13:56:07.494026565 +0000 UTC m=+2122.744198483" lastFinishedPulling="2026-03-09 13:56:07.903901111 +0000 UTC m=+2123.154073019" observedRunningTime="2026-03-09 13:56:09.151326454 +0000 UTC m=+2124.401498352" watchObservedRunningTime="2026-03-09 13:56:09.155096334 +0000 UTC m=+2124.405268242" Mar 09 13:56:19 crc kubenswrapper[4764]: I0309 13:56:19.226858 4764 generic.go:334] "Generic (PLEG): container finished" podID="de1bf125-47e1-499c-9cfe-ffbd5c03d194" containerID="f3660ffb8bd5fb3872ac27e05718cfd3efdfa21835cdaa2254ffd4445a2924fc" exitCode=0 Mar 09 13:56:19 crc kubenswrapper[4764]: I0309 13:56:19.226984 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" 
event={"ID":"de1bf125-47e1-499c-9cfe-ffbd5c03d194","Type":"ContainerDied","Data":"f3660ffb8bd5fb3872ac27e05718cfd3efdfa21835cdaa2254ffd4445a2924fc"} Mar 09 13:56:20 crc kubenswrapper[4764]: I0309 13:56:20.657088 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:20 crc kubenswrapper[4764]: I0309 13:56:20.721935 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-ceph\") pod \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " Mar 09 13:56:20 crc kubenswrapper[4764]: I0309 13:56:20.721991 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptkxs\" (UniqueName: \"kubernetes.io/projected/de1bf125-47e1-499c-9cfe-ffbd5c03d194-kube-api-access-ptkxs\") pod \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " Mar 09 13:56:20 crc kubenswrapper[4764]: I0309 13:56:20.722085 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-inventory\") pod \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " Mar 09 13:56:20 crc kubenswrapper[4764]: I0309 13:56:20.722217 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-ssh-key-openstack-edpm-ipam\") pod \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " Mar 09 13:56:20 crc kubenswrapper[4764]: I0309 13:56:20.722251 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-repo-setup-combined-ca-bundle\") pod \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " Mar 09 13:56:20 crc kubenswrapper[4764]: I0309 13:56:20.729909 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de1bf125-47e1-499c-9cfe-ffbd5c03d194-kube-api-access-ptkxs" (OuterVolumeSpecName: "kube-api-access-ptkxs") pod "de1bf125-47e1-499c-9cfe-ffbd5c03d194" (UID: "de1bf125-47e1-499c-9cfe-ffbd5c03d194"). InnerVolumeSpecName "kube-api-access-ptkxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:56:20 crc kubenswrapper[4764]: I0309 13:56:20.730048 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "de1bf125-47e1-499c-9cfe-ffbd5c03d194" (UID: "de1bf125-47e1-499c-9cfe-ffbd5c03d194"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:56:20 crc kubenswrapper[4764]: I0309 13:56:20.731245 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-ceph" (OuterVolumeSpecName: "ceph") pod "de1bf125-47e1-499c-9cfe-ffbd5c03d194" (UID: "de1bf125-47e1-499c-9cfe-ffbd5c03d194"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:56:20 crc kubenswrapper[4764]: I0309 13:56:20.751954 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-inventory" (OuterVolumeSpecName: "inventory") pod "de1bf125-47e1-499c-9cfe-ffbd5c03d194" (UID: "de1bf125-47e1-499c-9cfe-ffbd5c03d194"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:56:20 crc kubenswrapper[4764]: I0309 13:56:20.766374 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "de1bf125-47e1-499c-9cfe-ffbd5c03d194" (UID: "de1bf125-47e1-499c-9cfe-ffbd5c03d194"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:56:20 crc kubenswrapper[4764]: I0309 13:56:20.825241 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:56:20 crc kubenswrapper[4764]: I0309 13:56:20.825288 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:56:20 crc kubenswrapper[4764]: I0309 13:56:20.825305 4764 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:56:20 crc kubenswrapper[4764]: I0309 13:56:20.825317 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 13:56:20 crc kubenswrapper[4764]: I0309 13:56:20.825330 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptkxs\" (UniqueName: \"kubernetes.io/projected/de1bf125-47e1-499c-9cfe-ffbd5c03d194-kube-api-access-ptkxs\") on node \"crc\" DevicePath \"\"" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.244241 4764 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" event={"ID":"de1bf125-47e1-499c-9cfe-ffbd5c03d194","Type":"ContainerDied","Data":"56f8a50aaf961365025d9562e61abf237ceb3b14bb828ea24cce3d59f02d7a83"} Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.244300 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56f8a50aaf961365025d9562e61abf237ceb3b14bb828ea24cce3d59f02d7a83" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.244351 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.328270 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk"] Mar 09 13:56:21 crc kubenswrapper[4764]: E0309 13:56:21.328853 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de1bf125-47e1-499c-9cfe-ffbd5c03d194" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.328878 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="de1bf125-47e1-499c-9cfe-ffbd5c03d194" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.329054 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="de1bf125-47e1-499c-9cfe-ffbd5c03d194" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.329871 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.334100 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.334121 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.334516 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.334673 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.334732 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.346601 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk"] Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.438111 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.438589 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.438681 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.438764 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4gb8\" (UniqueName: \"kubernetes.io/projected/7bd401e1-1592-4b49-8eb2-b6dcba296b36-kube-api-access-x4gb8\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.438926 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.540440 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.540498 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.540539 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4gb8\" (UniqueName: \"kubernetes.io/projected/7bd401e1-1592-4b49-8eb2-b6dcba296b36-kube-api-access-x4gb8\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.540635 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.541126 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.546367 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk\" (UID: 
\"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.546370 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.547849 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.550287 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.558673 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4gb8\" (UniqueName: \"kubernetes.io/projected/7bd401e1-1592-4b49-8eb2-b6dcba296b36-kube-api-access-x4gb8\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.658613 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:22 crc kubenswrapper[4764]: W0309 13:56:22.205074 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7bd401e1_1592_4b49_8eb2_b6dcba296b36.slice/crio-a87e4c6da3d7df1147833b51aecae9346095d370c45db0367159e0a123965636 WatchSource:0}: Error finding container a87e4c6da3d7df1147833b51aecae9346095d370c45db0367159e0a123965636: Status 404 returned error can't find the container with id a87e4c6da3d7df1147833b51aecae9346095d370c45db0367159e0a123965636 Mar 09 13:56:22 crc kubenswrapper[4764]: I0309 13:56:22.210295 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk"] Mar 09 13:56:22 crc kubenswrapper[4764]: I0309 13:56:22.255916 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" event={"ID":"7bd401e1-1592-4b49-8eb2-b6dcba296b36","Type":"ContainerStarted","Data":"a87e4c6da3d7df1147833b51aecae9346095d370c45db0367159e0a123965636"} Mar 09 13:56:23 crc kubenswrapper[4764]: I0309 13:56:23.267889 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" event={"ID":"7bd401e1-1592-4b49-8eb2-b6dcba296b36","Type":"ContainerStarted","Data":"2aa45ebc227a6925b794add5ccb7315a01e0a8cf206d439f62db0022923d1a2b"} Mar 09 13:56:23 crc kubenswrapper[4764]: I0309 13:56:23.292565 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" podStartSLOduration=1.549942337 podStartE2EDuration="2.29253615s" podCreationTimestamp="2026-03-09 13:56:21 +0000 UTC" firstStartedPulling="2026-03-09 13:56:22.209407891 +0000 UTC m=+2137.459579799" lastFinishedPulling="2026-03-09 13:56:22.952001704 +0000 UTC m=+2138.202173612" 
observedRunningTime="2026-03-09 13:56:23.287153516 +0000 UTC m=+2138.537325424" watchObservedRunningTime="2026-03-09 13:56:23.29253615 +0000 UTC m=+2138.542708058" Mar 09 13:57:08 crc kubenswrapper[4764]: I0309 13:57:08.573725 4764 scope.go:117] "RemoveContainer" containerID="3910a772e45d62c401146624cbc27497eee00dea57c88064e7e1af0b907bbfcc" Mar 09 13:57:08 crc kubenswrapper[4764]: I0309 13:57:08.616689 4764 scope.go:117] "RemoveContainer" containerID="b4cf48a07b982624103805e852925a23f44a6c1f17fbc126f6f6ed00345f5ccd" Mar 09 13:57:56 crc kubenswrapper[4764]: I0309 13:57:56.467488 4764 generic.go:334] "Generic (PLEG): container finished" podID="7bd401e1-1592-4b49-8eb2-b6dcba296b36" containerID="2aa45ebc227a6925b794add5ccb7315a01e0a8cf206d439f62db0022923d1a2b" exitCode=0 Mar 09 13:57:56 crc kubenswrapper[4764]: I0309 13:57:56.467578 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" event={"ID":"7bd401e1-1592-4b49-8eb2-b6dcba296b36","Type":"ContainerDied","Data":"2aa45ebc227a6925b794add5ccb7315a01e0a8cf206d439f62db0022923d1a2b"} Mar 09 13:57:57 crc kubenswrapper[4764]: I0309 13:57:57.950060 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.058607 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-ssh-key-openstack-edpm-ipam\") pod \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.059082 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-inventory\") pod \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.059164 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4gb8\" (UniqueName: \"kubernetes.io/projected/7bd401e1-1592-4b49-8eb2-b6dcba296b36-kube-api-access-x4gb8\") pod \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.059219 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-bootstrap-combined-ca-bundle\") pod \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.059993 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-ceph\") pod \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.067010 4764 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bd401e1-1592-4b49-8eb2-b6dcba296b36-kube-api-access-x4gb8" (OuterVolumeSpecName: "kube-api-access-x4gb8") pod "7bd401e1-1592-4b49-8eb2-b6dcba296b36" (UID: "7bd401e1-1592-4b49-8eb2-b6dcba296b36"). InnerVolumeSpecName "kube-api-access-x4gb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.067876 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-ceph" (OuterVolumeSpecName: "ceph") pod "7bd401e1-1592-4b49-8eb2-b6dcba296b36" (UID: "7bd401e1-1592-4b49-8eb2-b6dcba296b36"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.069956 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "7bd401e1-1592-4b49-8eb2-b6dcba296b36" (UID: "7bd401e1-1592-4b49-8eb2-b6dcba296b36"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.088361 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7bd401e1-1592-4b49-8eb2-b6dcba296b36" (UID: "7bd401e1-1592-4b49-8eb2-b6dcba296b36"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.091976 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-inventory" (OuterVolumeSpecName: "inventory") pod "7bd401e1-1592-4b49-8eb2-b6dcba296b36" (UID: "7bd401e1-1592-4b49-8eb2-b6dcba296b36"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.163266 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.163481 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4gb8\" (UniqueName: \"kubernetes.io/projected/7bd401e1-1592-4b49-8eb2-b6dcba296b36-kube-api-access-x4gb8\") on node \"crc\" DevicePath \"\"" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.163615 4764 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.163703 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.163799 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.370785 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.370885 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.491846 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" event={"ID":"7bd401e1-1592-4b49-8eb2-b6dcba296b36","Type":"ContainerDied","Data":"a87e4c6da3d7df1147833b51aecae9346095d370c45db0367159e0a123965636"} Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.491910 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a87e4c6da3d7df1147833b51aecae9346095d370c45db0367159e0a123965636" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.491934 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.594037 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2"] Mar 09 13:57:58 crc kubenswrapper[4764]: E0309 13:57:58.594542 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bd401e1-1592-4b49-8eb2-b6dcba296b36" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.594567 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bd401e1-1592-4b49-8eb2-b6dcba296b36" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.594788 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bd401e1-1592-4b49-8eb2-b6dcba296b36" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.595492 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.599694 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.599799 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.600035 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.601865 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.602365 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.616610 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2"] Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.675009 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j42q\" (UniqueName: \"kubernetes.io/projected/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-kube-api-access-6j42q\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-trkg2\" (UID: \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.675093 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-trkg2\" (UID: 
\"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.675484 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-trkg2\" (UID: \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.675694 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-trkg2\" (UID: \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.778157 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-trkg2\" (UID: \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.778498 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-trkg2\" (UID: \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" Mar 09 13:57:58 crc 
kubenswrapper[4764]: I0309 13:57:58.778683 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-trkg2\" (UID: \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.778925 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j42q\" (UniqueName: \"kubernetes.io/projected/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-kube-api-access-6j42q\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-trkg2\" (UID: \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.784442 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-trkg2\" (UID: \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.785293 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-trkg2\" (UID: \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.786461 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-ssh-key-openstack-edpm-ipam\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-trkg2\" (UID: \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.800147 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j42q\" (UniqueName: \"kubernetes.io/projected/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-kube-api-access-6j42q\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-trkg2\" (UID: \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.921378 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" Mar 09 13:57:59 crc kubenswrapper[4764]: I0309 13:57:59.511902 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2"] Mar 09 13:58:00 crc kubenswrapper[4764]: I0309 13:58:00.198429 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551078-z7ms2"] Mar 09 13:58:00 crc kubenswrapper[4764]: I0309 13:58:00.200788 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551078-z7ms2" Mar 09 13:58:00 crc kubenswrapper[4764]: I0309 13:58:00.204598 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 13:58:00 crc kubenswrapper[4764]: I0309 13:58:00.205831 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:58:00 crc kubenswrapper[4764]: I0309 13:58:00.205833 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:58:00 crc kubenswrapper[4764]: I0309 13:58:00.207238 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551078-z7ms2"] Mar 09 13:58:00 crc kubenswrapper[4764]: I0309 13:58:00.312266 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh64m\" (UniqueName: \"kubernetes.io/projected/f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9-kube-api-access-fh64m\") pod \"auto-csr-approver-29551078-z7ms2\" (UID: \"f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9\") " pod="openshift-infra/auto-csr-approver-29551078-z7ms2" Mar 09 13:58:00 crc kubenswrapper[4764]: I0309 13:58:00.415041 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh64m\" (UniqueName: \"kubernetes.io/projected/f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9-kube-api-access-fh64m\") pod \"auto-csr-approver-29551078-z7ms2\" (UID: \"f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9\") " pod="openshift-infra/auto-csr-approver-29551078-z7ms2" Mar 09 13:58:00 crc kubenswrapper[4764]: I0309 13:58:00.434093 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh64m\" (UniqueName: \"kubernetes.io/projected/f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9-kube-api-access-fh64m\") pod \"auto-csr-approver-29551078-z7ms2\" (UID: \"f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9\") " 
pod="openshift-infra/auto-csr-approver-29551078-z7ms2" Mar 09 13:58:00 crc kubenswrapper[4764]: I0309 13:58:00.510016 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" event={"ID":"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd","Type":"ContainerStarted","Data":"f10ba1ca1d3dcf319b1b4430ee734cd598ce5903404c6d034a09dcda8740d33c"} Mar 09 13:58:00 crc kubenswrapper[4764]: I0309 13:58:00.510077 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" event={"ID":"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd","Type":"ContainerStarted","Data":"29f661a48fc5aac47d0eb4e904f376b751a090d605e3f0814a502912e09baee7"} Mar 09 13:58:00 crc kubenswrapper[4764]: I0309 13:58:00.528507 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" podStartSLOduration=1.949712946 podStartE2EDuration="2.528485579s" podCreationTimestamp="2026-03-09 13:57:58 +0000 UTC" firstStartedPulling="2026-03-09 13:57:59.522734424 +0000 UTC m=+2234.772906332" lastFinishedPulling="2026-03-09 13:58:00.101507047 +0000 UTC m=+2235.351678965" observedRunningTime="2026-03-09 13:58:00.525789507 +0000 UTC m=+2235.775961415" watchObservedRunningTime="2026-03-09 13:58:00.528485579 +0000 UTC m=+2235.778657487" Mar 09 13:58:00 crc kubenswrapper[4764]: I0309 13:58:00.575681 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551078-z7ms2" Mar 09 13:58:01 crc kubenswrapper[4764]: I0309 13:58:01.025378 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551078-z7ms2"] Mar 09 13:58:01 crc kubenswrapper[4764]: W0309 13:58:01.035018 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf26fbbd6_fe1a_4ca6_82a8_e425edc3d3d9.slice/crio-185c831b6a62c35a276639340a74c1d8f2e9196ce4cf803064d20712a6e4370a WatchSource:0}: Error finding container 185c831b6a62c35a276639340a74c1d8f2e9196ce4cf803064d20712a6e4370a: Status 404 returned error can't find the container with id 185c831b6a62c35a276639340a74c1d8f2e9196ce4cf803064d20712a6e4370a Mar 09 13:58:01 crc kubenswrapper[4764]: I0309 13:58:01.522792 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551078-z7ms2" event={"ID":"f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9","Type":"ContainerStarted","Data":"185c831b6a62c35a276639340a74c1d8f2e9196ce4cf803064d20712a6e4370a"} Mar 09 13:58:02 crc kubenswrapper[4764]: I0309 13:58:02.533042 4764 generic.go:334] "Generic (PLEG): container finished" podID="f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9" containerID="e98df59174ca147f240e577b9eb7747712c4ce30b3cb22cafa3c795a0cc708fe" exitCode=0 Mar 09 13:58:02 crc kubenswrapper[4764]: I0309 13:58:02.533159 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551078-z7ms2" event={"ID":"f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9","Type":"ContainerDied","Data":"e98df59174ca147f240e577b9eb7747712c4ce30b3cb22cafa3c795a0cc708fe"} Mar 09 13:58:04 crc kubenswrapper[4764]: I0309 13:58:04.116661 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551078-z7ms2" Mar 09 13:58:04 crc kubenswrapper[4764]: I0309 13:58:04.199905 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh64m\" (UniqueName: \"kubernetes.io/projected/f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9-kube-api-access-fh64m\") pod \"f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9\" (UID: \"f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9\") " Mar 09 13:58:04 crc kubenswrapper[4764]: I0309 13:58:04.206701 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9-kube-api-access-fh64m" (OuterVolumeSpecName: "kube-api-access-fh64m") pod "f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9" (UID: "f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9"). InnerVolumeSpecName "kube-api-access-fh64m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:58:04 crc kubenswrapper[4764]: I0309 13:58:04.302741 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh64m\" (UniqueName: \"kubernetes.io/projected/f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9-kube-api-access-fh64m\") on node \"crc\" DevicePath \"\"" Mar 09 13:58:04 crc kubenswrapper[4764]: I0309 13:58:04.557606 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551078-z7ms2" event={"ID":"f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9","Type":"ContainerDied","Data":"185c831b6a62c35a276639340a74c1d8f2e9196ce4cf803064d20712a6e4370a"} Mar 09 13:58:04 crc kubenswrapper[4764]: I0309 13:58:04.557730 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="185c831b6a62c35a276639340a74c1d8f2e9196ce4cf803064d20712a6e4370a" Mar 09 13:58:04 crc kubenswrapper[4764]: I0309 13:58:04.557733 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551078-z7ms2" Mar 09 13:58:05 crc kubenswrapper[4764]: I0309 13:58:05.201915 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551072-2q8rg"] Mar 09 13:58:05 crc kubenswrapper[4764]: I0309 13:58:05.219637 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551072-2q8rg"] Mar 09 13:58:05 crc kubenswrapper[4764]: I0309 13:58:05.571753 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f422975f-b0ee-4ef9-be32-3aac0003a54d" path="/var/lib/kubelet/pods/f422975f-b0ee-4ef9-be32-3aac0003a54d/volumes" Mar 09 13:58:08 crc kubenswrapper[4764]: I0309 13:58:08.720223 4764 scope.go:117] "RemoveContainer" containerID="c5badba421cb7b57d824a0a46932addc59fbe992b12c15d75cb49304a00e2761" Mar 09 13:58:08 crc kubenswrapper[4764]: I0309 13:58:08.763575 4764 scope.go:117] "RemoveContainer" containerID="860e53941ce3fa2c04ed10b54cb5f6fa59f9d4631186670908ce0c966139e37f" Mar 09 13:58:08 crc kubenswrapper[4764]: I0309 13:58:08.816979 4764 scope.go:117] "RemoveContainer" containerID="b589f4b8fd5031689b4581120afd3a37af8b44cf26e7b85ac17d85bceae4a6f2" Mar 09 13:58:08 crc kubenswrapper[4764]: I0309 13:58:08.858787 4764 scope.go:117] "RemoveContainer" containerID="a382abef431f7e8dce93c27f5f5efe18631bdeb380db25088eb96acfd690b493" Mar 09 13:58:12 crc kubenswrapper[4764]: I0309 13:58:12.760230 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f9bgf"] Mar 09 13:58:12 crc kubenswrapper[4764]: E0309 13:58:12.761225 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9" containerName="oc" Mar 09 13:58:12 crc kubenswrapper[4764]: I0309 13:58:12.761244 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9" containerName="oc" Mar 09 13:58:12 crc kubenswrapper[4764]: I0309 13:58:12.761489 
4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9" containerName="oc" Mar 09 13:58:12 crc kubenswrapper[4764]: I0309 13:58:12.785263 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f9bgf" Mar 09 13:58:12 crc kubenswrapper[4764]: I0309 13:58:12.825348 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f9bgf"] Mar 09 13:58:12 crc kubenswrapper[4764]: I0309 13:58:12.890799 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcfsx\" (UniqueName: \"kubernetes.io/projected/1931244b-286c-4ad0-88f6-8377df60b155-kube-api-access-gcfsx\") pod \"community-operators-f9bgf\" (UID: \"1931244b-286c-4ad0-88f6-8377df60b155\") " pod="openshift-marketplace/community-operators-f9bgf" Mar 09 13:58:12 crc kubenswrapper[4764]: I0309 13:58:12.890887 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1931244b-286c-4ad0-88f6-8377df60b155-utilities\") pod \"community-operators-f9bgf\" (UID: \"1931244b-286c-4ad0-88f6-8377df60b155\") " pod="openshift-marketplace/community-operators-f9bgf" Mar 09 13:58:12 crc kubenswrapper[4764]: I0309 13:58:12.890908 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1931244b-286c-4ad0-88f6-8377df60b155-catalog-content\") pod \"community-operators-f9bgf\" (UID: \"1931244b-286c-4ad0-88f6-8377df60b155\") " pod="openshift-marketplace/community-operators-f9bgf" Mar 09 13:58:12 crc kubenswrapper[4764]: I0309 13:58:12.992978 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1931244b-286c-4ad0-88f6-8377df60b155-utilities\") pod 
\"community-operators-f9bgf\" (UID: \"1931244b-286c-4ad0-88f6-8377df60b155\") " pod="openshift-marketplace/community-operators-f9bgf" Mar 09 13:58:12 crc kubenswrapper[4764]: I0309 13:58:12.993045 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1931244b-286c-4ad0-88f6-8377df60b155-catalog-content\") pod \"community-operators-f9bgf\" (UID: \"1931244b-286c-4ad0-88f6-8377df60b155\") " pod="openshift-marketplace/community-operators-f9bgf" Mar 09 13:58:12 crc kubenswrapper[4764]: I0309 13:58:12.993253 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcfsx\" (UniqueName: \"kubernetes.io/projected/1931244b-286c-4ad0-88f6-8377df60b155-kube-api-access-gcfsx\") pod \"community-operators-f9bgf\" (UID: \"1931244b-286c-4ad0-88f6-8377df60b155\") " pod="openshift-marketplace/community-operators-f9bgf" Mar 09 13:58:12 crc kubenswrapper[4764]: I0309 13:58:12.993791 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1931244b-286c-4ad0-88f6-8377df60b155-utilities\") pod \"community-operators-f9bgf\" (UID: \"1931244b-286c-4ad0-88f6-8377df60b155\") " pod="openshift-marketplace/community-operators-f9bgf" Mar 09 13:58:12 crc kubenswrapper[4764]: I0309 13:58:12.993968 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1931244b-286c-4ad0-88f6-8377df60b155-catalog-content\") pod \"community-operators-f9bgf\" (UID: \"1931244b-286c-4ad0-88f6-8377df60b155\") " pod="openshift-marketplace/community-operators-f9bgf" Mar 09 13:58:13 crc kubenswrapper[4764]: I0309 13:58:13.022911 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcfsx\" (UniqueName: \"kubernetes.io/projected/1931244b-286c-4ad0-88f6-8377df60b155-kube-api-access-gcfsx\") pod 
\"community-operators-f9bgf\" (UID: \"1931244b-286c-4ad0-88f6-8377df60b155\") " pod="openshift-marketplace/community-operators-f9bgf" Mar 09 13:58:13 crc kubenswrapper[4764]: I0309 13:58:13.128295 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f9bgf" Mar 09 13:58:13 crc kubenswrapper[4764]: I0309 13:58:13.714216 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f9bgf"] Mar 09 13:58:14 crc kubenswrapper[4764]: I0309 13:58:14.666712 4764 generic.go:334] "Generic (PLEG): container finished" podID="1931244b-286c-4ad0-88f6-8377df60b155" containerID="9c4e5d81c1d20062b0e3343e0f1ab702fc7945a117dd27b6ebbcbe5d4c9462db" exitCode=0 Mar 09 13:58:14 crc kubenswrapper[4764]: I0309 13:58:14.666783 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9bgf" event={"ID":"1931244b-286c-4ad0-88f6-8377df60b155","Type":"ContainerDied","Data":"9c4e5d81c1d20062b0e3343e0f1ab702fc7945a117dd27b6ebbcbe5d4c9462db"} Mar 09 13:58:14 crc kubenswrapper[4764]: I0309 13:58:14.667038 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9bgf" event={"ID":"1931244b-286c-4ad0-88f6-8377df60b155","Type":"ContainerStarted","Data":"5a07e8c2852fbb585a65409f1e4c8c4d2e35c904ba61adc7668c8876210cbafa"} Mar 09 13:58:15 crc kubenswrapper[4764]: I0309 13:58:15.678093 4764 generic.go:334] "Generic (PLEG): container finished" podID="1931244b-286c-4ad0-88f6-8377df60b155" containerID="b0c90e0f9557a0ca138f4428d931980ec160ff5828cc0e82cda4301e32feb3a6" exitCode=0 Mar 09 13:58:15 crc kubenswrapper[4764]: I0309 13:58:15.678157 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9bgf" event={"ID":"1931244b-286c-4ad0-88f6-8377df60b155","Type":"ContainerDied","Data":"b0c90e0f9557a0ca138f4428d931980ec160ff5828cc0e82cda4301e32feb3a6"} Mar 09 13:58:15 crc 
kubenswrapper[4764]: I0309 13:58:15.952076 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9ffpb"] Mar 09 13:58:15 crc kubenswrapper[4764]: I0309 13:58:15.954681 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9ffpb" Mar 09 13:58:15 crc kubenswrapper[4764]: I0309 13:58:15.975617 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9ffpb"] Mar 09 13:58:16 crc kubenswrapper[4764]: I0309 13:58:16.058673 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57mcq\" (UniqueName: \"kubernetes.io/projected/fc434dcc-281c-4972-8abf-f1353e818c92-kube-api-access-57mcq\") pod \"redhat-operators-9ffpb\" (UID: \"fc434dcc-281c-4972-8abf-f1353e818c92\") " pod="openshift-marketplace/redhat-operators-9ffpb" Mar 09 13:58:16 crc kubenswrapper[4764]: I0309 13:58:16.058728 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc434dcc-281c-4972-8abf-f1353e818c92-utilities\") pod \"redhat-operators-9ffpb\" (UID: \"fc434dcc-281c-4972-8abf-f1353e818c92\") " pod="openshift-marketplace/redhat-operators-9ffpb" Mar 09 13:58:16 crc kubenswrapper[4764]: I0309 13:58:16.058766 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc434dcc-281c-4972-8abf-f1353e818c92-catalog-content\") pod \"redhat-operators-9ffpb\" (UID: \"fc434dcc-281c-4972-8abf-f1353e818c92\") " pod="openshift-marketplace/redhat-operators-9ffpb" Mar 09 13:58:16 crc kubenswrapper[4764]: I0309 13:58:16.160362 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc434dcc-281c-4972-8abf-f1353e818c92-catalog-content\") pod 
\"redhat-operators-9ffpb\" (UID: \"fc434dcc-281c-4972-8abf-f1353e818c92\") " pod="openshift-marketplace/redhat-operators-9ffpb" Mar 09 13:58:16 crc kubenswrapper[4764]: I0309 13:58:16.160568 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57mcq\" (UniqueName: \"kubernetes.io/projected/fc434dcc-281c-4972-8abf-f1353e818c92-kube-api-access-57mcq\") pod \"redhat-operators-9ffpb\" (UID: \"fc434dcc-281c-4972-8abf-f1353e818c92\") " pod="openshift-marketplace/redhat-operators-9ffpb" Mar 09 13:58:16 crc kubenswrapper[4764]: I0309 13:58:16.160599 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc434dcc-281c-4972-8abf-f1353e818c92-utilities\") pod \"redhat-operators-9ffpb\" (UID: \"fc434dcc-281c-4972-8abf-f1353e818c92\") " pod="openshift-marketplace/redhat-operators-9ffpb" Mar 09 13:58:16 crc kubenswrapper[4764]: I0309 13:58:16.161145 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc434dcc-281c-4972-8abf-f1353e818c92-utilities\") pod \"redhat-operators-9ffpb\" (UID: \"fc434dcc-281c-4972-8abf-f1353e818c92\") " pod="openshift-marketplace/redhat-operators-9ffpb" Mar 09 13:58:16 crc kubenswrapper[4764]: I0309 13:58:16.161158 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc434dcc-281c-4972-8abf-f1353e818c92-catalog-content\") pod \"redhat-operators-9ffpb\" (UID: \"fc434dcc-281c-4972-8abf-f1353e818c92\") " pod="openshift-marketplace/redhat-operators-9ffpb" Mar 09 13:58:16 crc kubenswrapper[4764]: I0309 13:58:16.188345 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57mcq\" (UniqueName: \"kubernetes.io/projected/fc434dcc-281c-4972-8abf-f1353e818c92-kube-api-access-57mcq\") pod \"redhat-operators-9ffpb\" (UID: 
\"fc434dcc-281c-4972-8abf-f1353e818c92\") " pod="openshift-marketplace/redhat-operators-9ffpb" Mar 09 13:58:16 crc kubenswrapper[4764]: I0309 13:58:16.290075 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9ffpb" Mar 09 13:58:16 crc kubenswrapper[4764]: I0309 13:58:16.717895 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9bgf" event={"ID":"1931244b-286c-4ad0-88f6-8377df60b155","Type":"ContainerStarted","Data":"3bafd88819c82083f9d265cba9c11ed09b324c27b40c3b4f93655a2d0ec404d1"} Mar 09 13:58:16 crc kubenswrapper[4764]: I0309 13:58:16.756456 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f9bgf" podStartSLOduration=3.3328415590000002 podStartE2EDuration="4.756432112s" podCreationTimestamp="2026-03-09 13:58:12 +0000 UTC" firstStartedPulling="2026-03-09 13:58:14.66950032 +0000 UTC m=+2249.919672248" lastFinishedPulling="2026-03-09 13:58:16.093090893 +0000 UTC m=+2251.343262801" observedRunningTime="2026-03-09 13:58:16.750459443 +0000 UTC m=+2252.000631351" watchObservedRunningTime="2026-03-09 13:58:16.756432112 +0000 UTC m=+2252.006604030" Mar 09 13:58:16 crc kubenswrapper[4764]: I0309 13:58:16.839367 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9ffpb"] Mar 09 13:58:17 crc kubenswrapper[4764]: I0309 13:58:17.737117 4764 generic.go:334] "Generic (PLEG): container finished" podID="fc434dcc-281c-4972-8abf-f1353e818c92" containerID="61a825d63b826dd28b590964b6ad42d2157024d995f261bf76f1dc00cd69059e" exitCode=0 Mar 09 13:58:17 crc kubenswrapper[4764]: I0309 13:58:17.737195 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ffpb" event={"ID":"fc434dcc-281c-4972-8abf-f1353e818c92","Type":"ContainerDied","Data":"61a825d63b826dd28b590964b6ad42d2157024d995f261bf76f1dc00cd69059e"} Mar 
09 13:58:17 crc kubenswrapper[4764]: I0309 13:58:17.737242 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ffpb" event={"ID":"fc434dcc-281c-4972-8abf-f1353e818c92","Type":"ContainerStarted","Data":"b93f8daca4444928beadd1d0399b3e6b30bb625dfbaf7a7e54bb81b9084ca36a"}
Mar 09 13:58:18 crc kubenswrapper[4764]: I0309 13:58:18.748302 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ffpb" event={"ID":"fc434dcc-281c-4972-8abf-f1353e818c92","Type":"ContainerStarted","Data":"a7e8d42d98e3d5f94cb4b2154da79aff60ec5bc3c48a9fbccd39a399f98f3e94"}
Mar 09 13:58:22 crc kubenswrapper[4764]: I0309 13:58:22.785008 4764 generic.go:334] "Generic (PLEG): container finished" podID="fc434dcc-281c-4972-8abf-f1353e818c92" containerID="a7e8d42d98e3d5f94cb4b2154da79aff60ec5bc3c48a9fbccd39a399f98f3e94" exitCode=0
Mar 09 13:58:22 crc kubenswrapper[4764]: I0309 13:58:22.785145 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ffpb" event={"ID":"fc434dcc-281c-4972-8abf-f1353e818c92","Type":"ContainerDied","Data":"a7e8d42d98e3d5f94cb4b2154da79aff60ec5bc3c48a9fbccd39a399f98f3e94"}
Mar 09 13:58:23 crc kubenswrapper[4764]: I0309 13:58:23.138673 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f9bgf"
Mar 09 13:58:23 crc kubenswrapper[4764]: I0309 13:58:23.138763 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f9bgf"
Mar 09 13:58:23 crc kubenswrapper[4764]: I0309 13:58:23.198161 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f9bgf"
Mar 09 13:58:23 crc kubenswrapper[4764]: I0309 13:58:23.796070 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ffpb" event={"ID":"fc434dcc-281c-4972-8abf-f1353e818c92","Type":"ContainerStarted","Data":"732763b672e5e098d63dd45c1ee15b7732366f951097a6849b5bfe81e2f7e6d0"}
Mar 09 13:58:23 crc kubenswrapper[4764]: I0309 13:58:23.828818 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9ffpb" podStartSLOduration=3.339945283 podStartE2EDuration="8.828789313s" podCreationTimestamp="2026-03-09 13:58:15 +0000 UTC" firstStartedPulling="2026-03-09 13:58:17.73944178 +0000 UTC m=+2252.989613698" lastFinishedPulling="2026-03-09 13:58:23.22828582 +0000 UTC m=+2258.478457728" observedRunningTime="2026-03-09 13:58:23.821514968 +0000 UTC m=+2259.071686896" watchObservedRunningTime="2026-03-09 13:58:23.828789313 +0000 UTC m=+2259.078961221"
Mar 09 13:58:23 crc kubenswrapper[4764]: I0309 13:58:23.844492 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f9bgf"
Mar 09 13:58:24 crc kubenswrapper[4764]: I0309 13:58:24.807969 4764 generic.go:334] "Generic (PLEG): container finished" podID="5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd" containerID="f10ba1ca1d3dcf319b1b4430ee734cd598ce5903404c6d034a09dcda8740d33c" exitCode=0
Mar 09 13:58:24 crc kubenswrapper[4764]: I0309 13:58:24.808085 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" event={"ID":"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd","Type":"ContainerDied","Data":"f10ba1ca1d3dcf319b1b4430ee734cd598ce5903404c6d034a09dcda8740d33c"}
Mar 09 13:58:25 crc kubenswrapper[4764]: I0309 13:58:25.148012 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f9bgf"]
Mar 09 13:58:25 crc kubenswrapper[4764]: I0309 13:58:25.816727 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f9bgf" podUID="1931244b-286c-4ad0-88f6-8377df60b155" containerName="registry-server" containerID="cri-o://3bafd88819c82083f9d265cba9c11ed09b324c27b40c3b4f93655a2d0ec404d1" gracePeriod=2
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.291135 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9ffpb"
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.291726 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9ffpb"
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.377139 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2"
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.384884 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f9bgf"
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.514095 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-ceph\") pod \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\" (UID: \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\") "
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.514379 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j42q\" (UniqueName: \"kubernetes.io/projected/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-kube-api-access-6j42q\") pod \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\" (UID: \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\") "
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.514440 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcfsx\" (UniqueName: \"kubernetes.io/projected/1931244b-286c-4ad0-88f6-8377df60b155-kube-api-access-gcfsx\") pod \"1931244b-286c-4ad0-88f6-8377df60b155\" (UID: \"1931244b-286c-4ad0-88f6-8377df60b155\") "
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.514470 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-ssh-key-openstack-edpm-ipam\") pod \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\" (UID: \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\") "
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.514497 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1931244b-286c-4ad0-88f6-8377df60b155-catalog-content\") pod \"1931244b-286c-4ad0-88f6-8377df60b155\" (UID: \"1931244b-286c-4ad0-88f6-8377df60b155\") "
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.514520 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-inventory\") pod \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\" (UID: \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\") "
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.514584 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1931244b-286c-4ad0-88f6-8377df60b155-utilities\") pod \"1931244b-286c-4ad0-88f6-8377df60b155\" (UID: \"1931244b-286c-4ad0-88f6-8377df60b155\") "
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.515800 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1931244b-286c-4ad0-88f6-8377df60b155-utilities" (OuterVolumeSpecName: "utilities") pod "1931244b-286c-4ad0-88f6-8377df60b155" (UID: "1931244b-286c-4ad0-88f6-8377df60b155"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.521236 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1931244b-286c-4ad0-88f6-8377df60b155-kube-api-access-gcfsx" (OuterVolumeSpecName: "kube-api-access-gcfsx") pod "1931244b-286c-4ad0-88f6-8377df60b155" (UID: "1931244b-286c-4ad0-88f6-8377df60b155"). InnerVolumeSpecName "kube-api-access-gcfsx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.521350 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-ceph" (OuterVolumeSpecName: "ceph") pod "5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd" (UID: "5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.522857 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-kube-api-access-6j42q" (OuterVolumeSpecName: "kube-api-access-6j42q") pod "5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd" (UID: "5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd"). InnerVolumeSpecName "kube-api-access-6j42q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.545587 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd" (UID: "5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.546986 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-inventory" (OuterVolumeSpecName: "inventory") pod "5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd" (UID: "5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.576446 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1931244b-286c-4ad0-88f6-8377df60b155-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1931244b-286c-4ad0-88f6-8377df60b155" (UID: "1931244b-286c-4ad0-88f6-8377df60b155"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.616556 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcfsx\" (UniqueName: \"kubernetes.io/projected/1931244b-286c-4ad0-88f6-8377df60b155-kube-api-access-gcfsx\") on node \"crc\" DevicePath \"\""
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.616603 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1931244b-286c-4ad0-88f6-8377df60b155-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.616613 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.616627 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-inventory\") on node \"crc\" DevicePath \"\""
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.616730 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1931244b-286c-4ad0-88f6-8377df60b155-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.616746 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-ceph\") on node \"crc\" DevicePath \"\""
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.616757 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6j42q\" (UniqueName: \"kubernetes.io/projected/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-kube-api-access-6j42q\") on node \"crc\" DevicePath \"\""
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.828944 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" event={"ID":"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd","Type":"ContainerDied","Data":"29f661a48fc5aac47d0eb4e904f376b751a090d605e3f0814a502912e09baee7"}
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.829397 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29f661a48fc5aac47d0eb4e904f376b751a090d605e3f0814a502912e09baee7"
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.829015 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2"
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.831437 4764 generic.go:334] "Generic (PLEG): container finished" podID="1931244b-286c-4ad0-88f6-8377df60b155" containerID="3bafd88819c82083f9d265cba9c11ed09b324c27b40c3b4f93655a2d0ec404d1" exitCode=0
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.831516 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9bgf" event={"ID":"1931244b-286c-4ad0-88f6-8377df60b155","Type":"ContainerDied","Data":"3bafd88819c82083f9d265cba9c11ed09b324c27b40c3b4f93655a2d0ec404d1"}
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.831560 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9bgf" event={"ID":"1931244b-286c-4ad0-88f6-8377df60b155","Type":"ContainerDied","Data":"5a07e8c2852fbb585a65409f1e4c8c4d2e35c904ba61adc7668c8876210cbafa"}
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.831572 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f9bgf"
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.831579 4764 scope.go:117] "RemoveContainer" containerID="3bafd88819c82083f9d265cba9c11ed09b324c27b40c3b4f93655a2d0ec404d1"
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.863038 4764 scope.go:117] "RemoveContainer" containerID="b0c90e0f9557a0ca138f4428d931980ec160ff5828cc0e82cda4301e32feb3a6"
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.900072 4764 scope.go:117] "RemoveContainer" containerID="9c4e5d81c1d20062b0e3343e0f1ab702fc7945a117dd27b6ebbcbe5d4c9462db"
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.905063 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f9bgf"]
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.914633 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f9bgf"]
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.926951 4764 scope.go:117] "RemoveContainer" containerID="3bafd88819c82083f9d265cba9c11ed09b324c27b40c3b4f93655a2d0ec404d1"
Mar 09 13:58:26 crc kubenswrapper[4764]: E0309 13:58:26.930985 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bafd88819c82083f9d265cba9c11ed09b324c27b40c3b4f93655a2d0ec404d1\": container with ID starting with 3bafd88819c82083f9d265cba9c11ed09b324c27b40c3b4f93655a2d0ec404d1 not found: ID does not exist" containerID="3bafd88819c82083f9d265cba9c11ed09b324c27b40c3b4f93655a2d0ec404d1"
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.931050 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bafd88819c82083f9d265cba9c11ed09b324c27b40c3b4f93655a2d0ec404d1"} err="failed to get container status \"3bafd88819c82083f9d265cba9c11ed09b324c27b40c3b4f93655a2d0ec404d1\": rpc error: code = NotFound desc = could not find container \"3bafd88819c82083f9d265cba9c11ed09b324c27b40c3b4f93655a2d0ec404d1\": container with ID starting with 3bafd88819c82083f9d265cba9c11ed09b324c27b40c3b4f93655a2d0ec404d1 not found: ID does not exist"
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.931092 4764 scope.go:117] "RemoveContainer" containerID="b0c90e0f9557a0ca138f4428d931980ec160ff5828cc0e82cda4301e32feb3a6"
Mar 09 13:58:26 crc kubenswrapper[4764]: E0309 13:58:26.931469 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0c90e0f9557a0ca138f4428d931980ec160ff5828cc0e82cda4301e32feb3a6\": container with ID starting with b0c90e0f9557a0ca138f4428d931980ec160ff5828cc0e82cda4301e32feb3a6 not found: ID does not exist" containerID="b0c90e0f9557a0ca138f4428d931980ec160ff5828cc0e82cda4301e32feb3a6"
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.931499 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0c90e0f9557a0ca138f4428d931980ec160ff5828cc0e82cda4301e32feb3a6"} err="failed to get container status \"b0c90e0f9557a0ca138f4428d931980ec160ff5828cc0e82cda4301e32feb3a6\": rpc error: code = NotFound desc = could not find container \"b0c90e0f9557a0ca138f4428d931980ec160ff5828cc0e82cda4301e32feb3a6\": container with ID starting with b0c90e0f9557a0ca138f4428d931980ec160ff5828cc0e82cda4301e32feb3a6 not found: ID does not exist"
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.931516 4764 scope.go:117] "RemoveContainer" containerID="9c4e5d81c1d20062b0e3343e0f1ab702fc7945a117dd27b6ebbcbe5d4c9462db"
Mar 09 13:58:26 crc kubenswrapper[4764]: E0309 13:58:26.932990 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c4e5d81c1d20062b0e3343e0f1ab702fc7945a117dd27b6ebbcbe5d4c9462db\": container with ID starting with 9c4e5d81c1d20062b0e3343e0f1ab702fc7945a117dd27b6ebbcbe5d4c9462db not found: ID does not exist" containerID="9c4e5d81c1d20062b0e3343e0f1ab702fc7945a117dd27b6ebbcbe5d4c9462db"
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.933023 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c4e5d81c1d20062b0e3343e0f1ab702fc7945a117dd27b6ebbcbe5d4c9462db"} err="failed to get container status \"9c4e5d81c1d20062b0e3343e0f1ab702fc7945a117dd27b6ebbcbe5d4c9462db\": rpc error: code = NotFound desc = could not find container \"9c4e5d81c1d20062b0e3343e0f1ab702fc7945a117dd27b6ebbcbe5d4c9462db\": container with ID starting with 9c4e5d81c1d20062b0e3343e0f1ab702fc7945a117dd27b6ebbcbe5d4c9462db not found: ID does not exist"
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.944841 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m"]
Mar 09 13:58:26 crc kubenswrapper[4764]: E0309 13:58:26.945417 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1931244b-286c-4ad0-88f6-8377df60b155" containerName="extract-content"
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.945437 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1931244b-286c-4ad0-88f6-8377df60b155" containerName="extract-content"
Mar 09 13:58:26 crc kubenswrapper[4764]: E0309 13:58:26.945456 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1931244b-286c-4ad0-88f6-8377df60b155" containerName="extract-utilities"
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.945463 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1931244b-286c-4ad0-88f6-8377df60b155" containerName="extract-utilities"
Mar 09 13:58:26 crc kubenswrapper[4764]: E0309 13:58:26.945479 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1931244b-286c-4ad0-88f6-8377df60b155" containerName="registry-server"
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.945485 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1931244b-286c-4ad0-88f6-8377df60b155" containerName="registry-server"
Mar 09 13:58:26 crc kubenswrapper[4764]: E0309 13:58:26.945505 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.945512 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.945712 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.945729 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1931244b-286c-4ad0-88f6-8377df60b155" containerName="registry-server"
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.946526 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m"
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.949982 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.950396 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.950584 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.951048 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.952036 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8"
Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.956430 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m"]
Mar 09 13:58:27 crc kubenswrapper[4764]: I0309 13:58:27.028315 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m\" (UID: \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m"
Mar 09 13:58:27 crc kubenswrapper[4764]: I0309 13:58:27.028381 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m\" (UID: \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m"
Mar 09 13:58:27 crc kubenswrapper[4764]: I0309 13:58:27.028633 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzwdg\" (UniqueName: \"kubernetes.io/projected/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-kube-api-access-pzwdg\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m\" (UID: \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m"
Mar 09 13:58:27 crc kubenswrapper[4764]: I0309 13:58:27.028857 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m\" (UID: \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m"
Mar 09 13:58:27 crc kubenswrapper[4764]: I0309 13:58:27.130533 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m\" (UID: \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m"
Mar 09 13:58:27 crc kubenswrapper[4764]: I0309 13:58:27.130619 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m\" (UID: \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m"
Mar 09 13:58:27 crc kubenswrapper[4764]: I0309 13:58:27.130709 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzwdg\" (UniqueName: \"kubernetes.io/projected/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-kube-api-access-pzwdg\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m\" (UID: \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m"
Mar 09 13:58:27 crc kubenswrapper[4764]: I0309 13:58:27.130831 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m\" (UID: \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m"
Mar 09 13:58:27 crc kubenswrapper[4764]: I0309 13:58:27.136187 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m\" (UID: \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m"
Mar 09 13:58:27 crc kubenswrapper[4764]: I0309 13:58:27.136268 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m\" (UID: \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m"
Mar 09 13:58:27 crc kubenswrapper[4764]: I0309 13:58:27.140261 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m\" (UID: \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m"
Mar 09 13:58:27 crc kubenswrapper[4764]: I0309 13:58:27.155165 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzwdg\" (UniqueName: \"kubernetes.io/projected/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-kube-api-access-pzwdg\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m\" (UID: \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m"
Mar 09 13:58:27 crc kubenswrapper[4764]: I0309 13:58:27.308174 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m"
Mar 09 13:58:27 crc kubenswrapper[4764]: I0309 13:58:27.337905 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9ffpb" podUID="fc434dcc-281c-4972-8abf-f1353e818c92" containerName="registry-server" probeResult="failure" output=<
Mar 09 13:58:27 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s
Mar 09 13:58:27 crc kubenswrapper[4764]: >
Mar 09 13:58:27 crc kubenswrapper[4764]: I0309 13:58:27.607068 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1931244b-286c-4ad0-88f6-8377df60b155" path="/var/lib/kubelet/pods/1931244b-286c-4ad0-88f6-8377df60b155/volumes"
Mar 09 13:58:27 crc kubenswrapper[4764]: I0309 13:58:27.920105 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m"]
Mar 09 13:58:27 crc kubenswrapper[4764]: W0309 13:58:27.925015 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93b0ad6c_7720_4b43_b65c_83b7b7a8c3ab.slice/crio-86eca36a2fd702dd87d457eb0f60f37cd82a4160e83acda801e0cfe2b332e2c9 WatchSource:0}: Error finding container 86eca36a2fd702dd87d457eb0f60f37cd82a4160e83acda801e0cfe2b332e2c9: Status 404 returned error can't find the container with id 86eca36a2fd702dd87d457eb0f60f37cd82a4160e83acda801e0cfe2b332e2c9
Mar 09 13:58:28 crc kubenswrapper[4764]: I0309 13:58:28.370078 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 13:58:28 crc kubenswrapper[4764]: I0309 13:58:28.370415 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 13:58:28 crc kubenswrapper[4764]: I0309 13:58:28.855252 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m" event={"ID":"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab","Type":"ContainerStarted","Data":"68b7fc8b415d7fd18af743d52f845a521dd96c27435d9abcf4b8b98fd1b7cbb5"}
Mar 09 13:58:28 crc kubenswrapper[4764]: I0309 13:58:28.855627 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m" event={"ID":"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab","Type":"ContainerStarted","Data":"86eca36a2fd702dd87d457eb0f60f37cd82a4160e83acda801e0cfe2b332e2c9"}
Mar 09 13:58:28 crc kubenswrapper[4764]: I0309 13:58:28.883975 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m" podStartSLOduration=2.451100662 podStartE2EDuration="2.88395125s" podCreationTimestamp="2026-03-09 13:58:26 +0000 UTC" firstStartedPulling="2026-03-09 13:58:27.932352631 +0000 UTC m=+2263.182524539" lastFinishedPulling="2026-03-09 13:58:28.365203219 +0000 UTC m=+2263.615375127" observedRunningTime="2026-03-09 13:58:28.875954417 +0000 UTC m=+2264.126126335" watchObservedRunningTime="2026-03-09 13:58:28.88395125 +0000 UTC m=+2264.134123168"
Mar 09 13:58:33 crc kubenswrapper[4764]: I0309 13:58:33.902476 4764 generic.go:334] "Generic (PLEG): container finished" podID="93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab" containerID="68b7fc8b415d7fd18af743d52f845a521dd96c27435d9abcf4b8b98fd1b7cbb5" exitCode=0
Mar 09 13:58:33 crc kubenswrapper[4764]: I0309 13:58:33.902711 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m" event={"ID":"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab","Type":"ContainerDied","Data":"68b7fc8b415d7fd18af743d52f845a521dd96c27435d9abcf4b8b98fd1b7cbb5"}
Mar 09 13:58:35 crc kubenswrapper[4764]: I0309 13:58:35.387531 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m"
Mar 09 13:58:35 crc kubenswrapper[4764]: I0309 13:58:35.583333 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-inventory\") pod \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\" (UID: \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\") "
Mar 09 13:58:35 crc kubenswrapper[4764]: I0309 13:58:35.583492 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-ssh-key-openstack-edpm-ipam\") pod \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\" (UID: \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\") "
Mar 09 13:58:35 crc kubenswrapper[4764]: I0309 13:58:35.583580 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-ceph\") pod \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\" (UID: \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\") "
Mar 09 13:58:35 crc kubenswrapper[4764]: I0309 13:58:35.583663 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzwdg\" (UniqueName: \"kubernetes.io/projected/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-kube-api-access-pzwdg\") pod \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\" (UID: \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\") "
Mar 09 13:58:35 crc kubenswrapper[4764]: I0309 13:58:35.602993 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-ceph" (OuterVolumeSpecName: "ceph") pod "93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab" (UID: "93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:58:35 crc kubenswrapper[4764]: I0309 13:58:35.603969 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-kube-api-access-pzwdg" (OuterVolumeSpecName: "kube-api-access-pzwdg") pod "93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab" (UID: "93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab"). InnerVolumeSpecName "kube-api-access-pzwdg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:58:35 crc kubenswrapper[4764]: I0309 13:58:35.614540 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-inventory" (OuterVolumeSpecName: "inventory") pod "93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab" (UID: "93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:58:35 crc kubenswrapper[4764]: I0309 13:58:35.617931 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab" (UID: "93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:58:35 crc kubenswrapper[4764]: I0309 13:58:35.687267 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 09 13:58:35 crc kubenswrapper[4764]: I0309 13:58:35.687334 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-ceph\") on node \"crc\" DevicePath \"\""
Mar 09 13:58:35 crc kubenswrapper[4764]: I0309 13:58:35.687354 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzwdg\" (UniqueName: \"kubernetes.io/projected/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-kube-api-access-pzwdg\") on node \"crc\" DevicePath \"\""
Mar 09 13:58:35 crc kubenswrapper[4764]: I0309 13:58:35.687371 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-inventory\") on node \"crc\" DevicePath \"\""
Mar 09 13:58:35 crc kubenswrapper[4764]: I0309 13:58:35.932010 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m" event={"ID":"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab","Type":"ContainerDied","Data":"86eca36a2fd702dd87d457eb0f60f37cd82a4160e83acda801e0cfe2b332e2c9"}
Mar 09 13:58:35 crc kubenswrapper[4764]: I0309 13:58:35.932083 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86eca36a2fd702dd87d457eb0f60f37cd82a4160e83acda801e0cfe2b332e2c9"
Mar 09 13:58:35 crc kubenswrapper[4764]: I0309 13:58:35.932127 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m"
Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.021757 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p"]
Mar 09 13:58:36 crc kubenswrapper[4764]: E0309 13:58:36.022385 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.022416 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.022686 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.023678 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p"
Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.026564 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8"
Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.026628 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.026755 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.027074 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.031959 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p"]
Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.042345 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.198371 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzzj2\" (UniqueName: \"kubernetes.io/projected/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-kube-api-access-dzzj2\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bhp5p\" (UID: \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p"
Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.198449 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bhp5p\" (UID: 
\"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.198479 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bhp5p\" (UID: \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.198946 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bhp5p\" (UID: \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.301322 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bhp5p\" (UID: \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.301569 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzzj2\" (UniqueName: \"kubernetes.io/projected/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-kube-api-access-dzzj2\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bhp5p\" (UID: \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.301631 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bhp5p\" (UID: \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.301730 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bhp5p\" (UID: \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.305927 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bhp5p\" (UID: \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.306832 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bhp5p\" (UID: \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.310545 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bhp5p\" (UID: \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.329532 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzzj2\" (UniqueName: \"kubernetes.io/projected/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-kube-api-access-dzzj2\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bhp5p\" (UID: \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.337626 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9ffpb" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.350443 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.395713 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9ffpb" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.899864 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p"] Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.946365 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" event={"ID":"2d2ddcdd-77bf-4dc5-8170-02d297378dcb","Type":"ContainerStarted","Data":"be0e4cdf0dcf35386be7b745eaa83c77d9a655ec0e8c4111fb3dec8d28ca749e"} Mar 09 13:58:37 crc kubenswrapper[4764]: I0309 13:58:37.748205 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9ffpb"] Mar 09 13:58:37 crc kubenswrapper[4764]: I0309 13:58:37.958775 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" 
event={"ID":"2d2ddcdd-77bf-4dc5-8170-02d297378dcb","Type":"ContainerStarted","Data":"e0d17ea39e7555c976bc1934118f99c866e5a434f1c8b871b8eaacb7b52e87e5"} Mar 09 13:58:37 crc kubenswrapper[4764]: I0309 13:58:37.959092 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9ffpb" podUID="fc434dcc-281c-4972-8abf-f1353e818c92" containerName="registry-server" containerID="cri-o://732763b672e5e098d63dd45c1ee15b7732366f951097a6849b5bfe81e2f7e6d0" gracePeriod=2 Mar 09 13:58:37 crc kubenswrapper[4764]: I0309 13:58:37.984543 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" podStartSLOduration=2.548110863 podStartE2EDuration="2.984521147s" podCreationTimestamp="2026-03-09 13:58:35 +0000 UTC" firstStartedPulling="2026-03-09 13:58:36.904066398 +0000 UTC m=+2272.154238296" lastFinishedPulling="2026-03-09 13:58:37.340476672 +0000 UTC m=+2272.590648580" observedRunningTime="2026-03-09 13:58:37.983026817 +0000 UTC m=+2273.233198725" watchObservedRunningTime="2026-03-09 13:58:37.984521147 +0000 UTC m=+2273.234693055" Mar 09 13:58:38 crc kubenswrapper[4764]: I0309 13:58:38.408261 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9ffpb" Mar 09 13:58:38 crc kubenswrapper[4764]: I0309 13:58:38.564123 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc434dcc-281c-4972-8abf-f1353e818c92-utilities\") pod \"fc434dcc-281c-4972-8abf-f1353e818c92\" (UID: \"fc434dcc-281c-4972-8abf-f1353e818c92\") " Mar 09 13:58:38 crc kubenswrapper[4764]: I0309 13:58:38.564250 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57mcq\" (UniqueName: \"kubernetes.io/projected/fc434dcc-281c-4972-8abf-f1353e818c92-kube-api-access-57mcq\") pod \"fc434dcc-281c-4972-8abf-f1353e818c92\" (UID: \"fc434dcc-281c-4972-8abf-f1353e818c92\") " Mar 09 13:58:38 crc kubenswrapper[4764]: I0309 13:58:38.564545 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc434dcc-281c-4972-8abf-f1353e818c92-catalog-content\") pod \"fc434dcc-281c-4972-8abf-f1353e818c92\" (UID: \"fc434dcc-281c-4972-8abf-f1353e818c92\") " Mar 09 13:58:38 crc kubenswrapper[4764]: I0309 13:58:38.565076 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc434dcc-281c-4972-8abf-f1353e818c92-utilities" (OuterVolumeSpecName: "utilities") pod "fc434dcc-281c-4972-8abf-f1353e818c92" (UID: "fc434dcc-281c-4972-8abf-f1353e818c92"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:58:38 crc kubenswrapper[4764]: I0309 13:58:38.580071 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc434dcc-281c-4972-8abf-f1353e818c92-kube-api-access-57mcq" (OuterVolumeSpecName: "kube-api-access-57mcq") pod "fc434dcc-281c-4972-8abf-f1353e818c92" (UID: "fc434dcc-281c-4972-8abf-f1353e818c92"). InnerVolumeSpecName "kube-api-access-57mcq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:58:38 crc kubenswrapper[4764]: I0309 13:58:38.667390 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc434dcc-281c-4972-8abf-f1353e818c92-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:58:38 crc kubenswrapper[4764]: I0309 13:58:38.667435 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57mcq\" (UniqueName: \"kubernetes.io/projected/fc434dcc-281c-4972-8abf-f1353e818c92-kube-api-access-57mcq\") on node \"crc\" DevicePath \"\"" Mar 09 13:58:38 crc kubenswrapper[4764]: I0309 13:58:38.718146 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc434dcc-281c-4972-8abf-f1353e818c92-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc434dcc-281c-4972-8abf-f1353e818c92" (UID: "fc434dcc-281c-4972-8abf-f1353e818c92"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:58:38 crc kubenswrapper[4764]: I0309 13:58:38.770050 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc434dcc-281c-4972-8abf-f1353e818c92-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:58:38 crc kubenswrapper[4764]: I0309 13:58:38.974007 4764 generic.go:334] "Generic (PLEG): container finished" podID="fc434dcc-281c-4972-8abf-f1353e818c92" containerID="732763b672e5e098d63dd45c1ee15b7732366f951097a6849b5bfe81e2f7e6d0" exitCode=0 Mar 09 13:58:38 crc kubenswrapper[4764]: I0309 13:58:38.974105 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9ffpb" Mar 09 13:58:38 crc kubenswrapper[4764]: I0309 13:58:38.974093 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ffpb" event={"ID":"fc434dcc-281c-4972-8abf-f1353e818c92","Type":"ContainerDied","Data":"732763b672e5e098d63dd45c1ee15b7732366f951097a6849b5bfe81e2f7e6d0"} Mar 09 13:58:38 crc kubenswrapper[4764]: I0309 13:58:38.974714 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ffpb" event={"ID":"fc434dcc-281c-4972-8abf-f1353e818c92","Type":"ContainerDied","Data":"b93f8daca4444928beadd1d0399b3e6b30bb625dfbaf7a7e54bb81b9084ca36a"} Mar 09 13:58:38 crc kubenswrapper[4764]: I0309 13:58:38.974756 4764 scope.go:117] "RemoveContainer" containerID="732763b672e5e098d63dd45c1ee15b7732366f951097a6849b5bfe81e2f7e6d0" Mar 09 13:58:39 crc kubenswrapper[4764]: I0309 13:58:39.009566 4764 scope.go:117] "RemoveContainer" containerID="a7e8d42d98e3d5f94cb4b2154da79aff60ec5bc3c48a9fbccd39a399f98f3e94" Mar 09 13:58:39 crc kubenswrapper[4764]: I0309 13:58:39.024498 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9ffpb"] Mar 09 13:58:39 crc kubenswrapper[4764]: I0309 13:58:39.031885 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9ffpb"] Mar 09 13:58:39 crc kubenswrapper[4764]: I0309 13:58:39.049705 4764 scope.go:117] "RemoveContainer" containerID="61a825d63b826dd28b590964b6ad42d2157024d995f261bf76f1dc00cd69059e" Mar 09 13:58:39 crc kubenswrapper[4764]: I0309 13:58:39.091993 4764 scope.go:117] "RemoveContainer" containerID="732763b672e5e098d63dd45c1ee15b7732366f951097a6849b5bfe81e2f7e6d0" Mar 09 13:58:39 crc kubenswrapper[4764]: E0309 13:58:39.092916 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"732763b672e5e098d63dd45c1ee15b7732366f951097a6849b5bfe81e2f7e6d0\": container with ID starting with 732763b672e5e098d63dd45c1ee15b7732366f951097a6849b5bfe81e2f7e6d0 not found: ID does not exist" containerID="732763b672e5e098d63dd45c1ee15b7732366f951097a6849b5bfe81e2f7e6d0" Mar 09 13:58:39 crc kubenswrapper[4764]: I0309 13:58:39.092983 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"732763b672e5e098d63dd45c1ee15b7732366f951097a6849b5bfe81e2f7e6d0"} err="failed to get container status \"732763b672e5e098d63dd45c1ee15b7732366f951097a6849b5bfe81e2f7e6d0\": rpc error: code = NotFound desc = could not find container \"732763b672e5e098d63dd45c1ee15b7732366f951097a6849b5bfe81e2f7e6d0\": container with ID starting with 732763b672e5e098d63dd45c1ee15b7732366f951097a6849b5bfe81e2f7e6d0 not found: ID does not exist" Mar 09 13:58:39 crc kubenswrapper[4764]: I0309 13:58:39.093039 4764 scope.go:117] "RemoveContainer" containerID="a7e8d42d98e3d5f94cb4b2154da79aff60ec5bc3c48a9fbccd39a399f98f3e94" Mar 09 13:58:39 crc kubenswrapper[4764]: E0309 13:58:39.093761 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7e8d42d98e3d5f94cb4b2154da79aff60ec5bc3c48a9fbccd39a399f98f3e94\": container with ID starting with a7e8d42d98e3d5f94cb4b2154da79aff60ec5bc3c48a9fbccd39a399f98f3e94 not found: ID does not exist" containerID="a7e8d42d98e3d5f94cb4b2154da79aff60ec5bc3c48a9fbccd39a399f98f3e94" Mar 09 13:58:39 crc kubenswrapper[4764]: I0309 13:58:39.093807 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e8d42d98e3d5f94cb4b2154da79aff60ec5bc3c48a9fbccd39a399f98f3e94"} err="failed to get container status \"a7e8d42d98e3d5f94cb4b2154da79aff60ec5bc3c48a9fbccd39a399f98f3e94\": rpc error: code = NotFound desc = could not find container \"a7e8d42d98e3d5f94cb4b2154da79aff60ec5bc3c48a9fbccd39a399f98f3e94\": container with ID 
starting with a7e8d42d98e3d5f94cb4b2154da79aff60ec5bc3c48a9fbccd39a399f98f3e94 not found: ID does not exist" Mar 09 13:58:39 crc kubenswrapper[4764]: I0309 13:58:39.093841 4764 scope.go:117] "RemoveContainer" containerID="61a825d63b826dd28b590964b6ad42d2157024d995f261bf76f1dc00cd69059e" Mar 09 13:58:39 crc kubenswrapper[4764]: E0309 13:58:39.094461 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61a825d63b826dd28b590964b6ad42d2157024d995f261bf76f1dc00cd69059e\": container with ID starting with 61a825d63b826dd28b590964b6ad42d2157024d995f261bf76f1dc00cd69059e not found: ID does not exist" containerID="61a825d63b826dd28b590964b6ad42d2157024d995f261bf76f1dc00cd69059e" Mar 09 13:58:39 crc kubenswrapper[4764]: I0309 13:58:39.094503 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61a825d63b826dd28b590964b6ad42d2157024d995f261bf76f1dc00cd69059e"} err="failed to get container status \"61a825d63b826dd28b590964b6ad42d2157024d995f261bf76f1dc00cd69059e\": rpc error: code = NotFound desc = could not find container \"61a825d63b826dd28b590964b6ad42d2157024d995f261bf76f1dc00cd69059e\": container with ID starting with 61a825d63b826dd28b590964b6ad42d2157024d995f261bf76f1dc00cd69059e not found: ID does not exist" Mar 09 13:58:39 crc kubenswrapper[4764]: I0309 13:58:39.572524 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc434dcc-281c-4972-8abf-f1353e818c92" path="/var/lib/kubelet/pods/fc434dcc-281c-4972-8abf-f1353e818c92/volumes" Mar 09 13:58:58 crc kubenswrapper[4764]: I0309 13:58:58.370636 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:58:58 crc kubenswrapper[4764]: I0309 
13:58:58.371331 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:58:58 crc kubenswrapper[4764]: I0309 13:58:58.371380 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:58:58 crc kubenswrapper[4764]: I0309 13:58:58.372326 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211"} pod="openshift-machine-config-operator/machine-config-daemon-xxczl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 13:58:58 crc kubenswrapper[4764]: I0309 13:58:58.372398 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" containerID="cri-o://827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" gracePeriod=600 Mar 09 13:58:58 crc kubenswrapper[4764]: E0309 13:58:58.505728 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:58:59 crc kubenswrapper[4764]: I0309 13:58:59.177379 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="6bcdd179-43c2-427c-9fac-7155c122e922" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" exitCode=0 Mar 09 13:58:59 crc kubenswrapper[4764]: I0309 13:58:59.177434 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerDied","Data":"827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211"} Mar 09 13:58:59 crc kubenswrapper[4764]: I0309 13:58:59.177926 4764 scope.go:117] "RemoveContainer" containerID="c4a2ad399af00232f256b54f6dba8cd11d56a98b4177187d6164b2b8cca69e65" Mar 09 13:58:59 crc kubenswrapper[4764]: I0309 13:58:59.179112 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 13:58:59 crc kubenswrapper[4764]: E0309 13:58:59.179609 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:59:09 crc kubenswrapper[4764]: I0309 13:59:09.277815 4764 generic.go:334] "Generic (PLEG): container finished" podID="2d2ddcdd-77bf-4dc5-8170-02d297378dcb" containerID="e0d17ea39e7555c976bc1934118f99c866e5a434f1c8b871b8eaacb7b52e87e5" exitCode=0 Mar 09 13:59:09 crc kubenswrapper[4764]: I0309 13:59:09.277867 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" event={"ID":"2d2ddcdd-77bf-4dc5-8170-02d297378dcb","Type":"ContainerDied","Data":"e0d17ea39e7555c976bc1934118f99c866e5a434f1c8b871b8eaacb7b52e87e5"} Mar 09 13:59:10 crc kubenswrapper[4764]: I0309 13:59:10.735519 4764 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" Mar 09 13:59:10 crc kubenswrapper[4764]: I0309 13:59:10.820593 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzzj2\" (UniqueName: \"kubernetes.io/projected/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-kube-api-access-dzzj2\") pod \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\" (UID: \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\") " Mar 09 13:59:10 crc kubenswrapper[4764]: I0309 13:59:10.820732 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-inventory\") pod \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\" (UID: \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\") " Mar 09 13:59:10 crc kubenswrapper[4764]: I0309 13:59:10.821002 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-ceph\") pod \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\" (UID: \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\") " Mar 09 13:59:10 crc kubenswrapper[4764]: I0309 13:59:10.821081 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-ssh-key-openstack-edpm-ipam\") pod \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\" (UID: \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\") " Mar 09 13:59:10 crc kubenswrapper[4764]: I0309 13:59:10.827407 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-ceph" (OuterVolumeSpecName: "ceph") pod "2d2ddcdd-77bf-4dc5-8170-02d297378dcb" (UID: "2d2ddcdd-77bf-4dc5-8170-02d297378dcb"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:59:10 crc kubenswrapper[4764]: I0309 13:59:10.837247 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-kube-api-access-dzzj2" (OuterVolumeSpecName: "kube-api-access-dzzj2") pod "2d2ddcdd-77bf-4dc5-8170-02d297378dcb" (UID: "2d2ddcdd-77bf-4dc5-8170-02d297378dcb"). InnerVolumeSpecName "kube-api-access-dzzj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:59:10 crc kubenswrapper[4764]: I0309 13:59:10.850277 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2d2ddcdd-77bf-4dc5-8170-02d297378dcb" (UID: "2d2ddcdd-77bf-4dc5-8170-02d297378dcb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:59:10 crc kubenswrapper[4764]: I0309 13:59:10.850361 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-inventory" (OuterVolumeSpecName: "inventory") pod "2d2ddcdd-77bf-4dc5-8170-02d297378dcb" (UID: "2d2ddcdd-77bf-4dc5-8170-02d297378dcb"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:59:10 crc kubenswrapper[4764]: I0309 13:59:10.926143 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:59:10 crc kubenswrapper[4764]: I0309 13:59:10.926477 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzzj2\" (UniqueName: \"kubernetes.io/projected/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-kube-api-access-dzzj2\") on node \"crc\" DevicePath \"\"" Mar 09 13:59:10 crc kubenswrapper[4764]: I0309 13:59:10.926556 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:59:10 crc kubenswrapper[4764]: I0309 13:59:10.926631 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.297141 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" event={"ID":"2d2ddcdd-77bf-4dc5-8170-02d297378dcb","Type":"ContainerDied","Data":"be0e4cdf0dcf35386be7b745eaa83c77d9a655ec0e8c4111fb3dec8d28ca749e"} Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.297220 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be0e4cdf0dcf35386be7b745eaa83c77d9a655ec0e8c4111fb3dec8d28ca749e" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.297237 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.400942 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv"] Mar 09 13:59:11 crc kubenswrapper[4764]: E0309 13:59:11.401475 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc434dcc-281c-4972-8abf-f1353e818c92" containerName="registry-server" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.401504 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc434dcc-281c-4972-8abf-f1353e818c92" containerName="registry-server" Mar 09 13:59:11 crc kubenswrapper[4764]: E0309 13:59:11.401533 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d2ddcdd-77bf-4dc5-8170-02d297378dcb" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.401546 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d2ddcdd-77bf-4dc5-8170-02d297378dcb" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:59:11 crc kubenswrapper[4764]: E0309 13:59:11.401568 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc434dcc-281c-4972-8abf-f1353e818c92" containerName="extract-content" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.401577 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc434dcc-281c-4972-8abf-f1353e818c92" containerName="extract-content" Mar 09 13:59:11 crc kubenswrapper[4764]: E0309 13:59:11.401589 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc434dcc-281c-4972-8abf-f1353e818c92" containerName="extract-utilities" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.401597 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc434dcc-281c-4972-8abf-f1353e818c92" containerName="extract-utilities" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.401893 
4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d2ddcdd-77bf-4dc5-8170-02d297378dcb" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.401946 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc434dcc-281c-4972-8abf-f1353e818c92" containerName="registry-server" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.402832 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.405084 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.405127 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.405587 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.405789 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.410582 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.414431 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv"] Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.545037 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv\" (UID: 
\"cbffc6a1-81df-479c-b40e-3f865c187a73\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.545160 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv\" (UID: \"cbffc6a1-81df-479c-b40e-3f865c187a73\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.545211 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs48l\" (UniqueName: \"kubernetes.io/projected/cbffc6a1-81df-479c-b40e-3f865c187a73-kube-api-access-xs48l\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv\" (UID: \"cbffc6a1-81df-479c-b40e-3f865c187a73\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.545242 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv\" (UID: \"cbffc6a1-81df-479c-b40e-3f865c187a73\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.646885 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv\" (UID: \"cbffc6a1-81df-479c-b40e-3f865c187a73\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" Mar 09 13:59:11 
crc kubenswrapper[4764]: I0309 13:59:11.647407 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs48l\" (UniqueName: \"kubernetes.io/projected/cbffc6a1-81df-479c-b40e-3f865c187a73-kube-api-access-xs48l\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv\" (UID: \"cbffc6a1-81df-479c-b40e-3f865c187a73\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.647531 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv\" (UID: \"cbffc6a1-81df-479c-b40e-3f865c187a73\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.648267 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv\" (UID: \"cbffc6a1-81df-479c-b40e-3f865c187a73\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.652400 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv\" (UID: \"cbffc6a1-81df-479c-b40e-3f865c187a73\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.652644 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-ssh-key-openstack-edpm-ipam\") pod 
\"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv\" (UID: \"cbffc6a1-81df-479c-b40e-3f865c187a73\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.655254 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv\" (UID: \"cbffc6a1-81df-479c-b40e-3f865c187a73\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.665350 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs48l\" (UniqueName: \"kubernetes.io/projected/cbffc6a1-81df-479c-b40e-3f865c187a73-kube-api-access-xs48l\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv\" (UID: \"cbffc6a1-81df-479c-b40e-3f865c187a73\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.721589 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" Mar 09 13:59:12 crc kubenswrapper[4764]: I0309 13:59:12.268122 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv"] Mar 09 13:59:12 crc kubenswrapper[4764]: I0309 13:59:12.308682 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" event={"ID":"cbffc6a1-81df-479c-b40e-3f865c187a73","Type":"ContainerStarted","Data":"2851c1e5435a239fbe006f90989fb7fbe06dda075e0fa33347b764f78afacfae"} Mar 09 13:59:13 crc kubenswrapper[4764]: I0309 13:59:13.320033 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" event={"ID":"cbffc6a1-81df-479c-b40e-3f865c187a73","Type":"ContainerStarted","Data":"a872ead50e271b964852737e0eae6103b43599f8554d92a0cf781c21b93321d7"} Mar 09 13:59:13 crc kubenswrapper[4764]: I0309 13:59:13.345043 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" podStartSLOduration=1.805215078 podStartE2EDuration="2.345017991s" podCreationTimestamp="2026-03-09 13:59:11 +0000 UTC" firstStartedPulling="2026-03-09 13:59:12.275309019 +0000 UTC m=+2307.525480927" lastFinishedPulling="2026-03-09 13:59:12.815111932 +0000 UTC m=+2308.065283840" observedRunningTime="2026-03-09 13:59:13.342878444 +0000 UTC m=+2308.593050352" watchObservedRunningTime="2026-03-09 13:59:13.345017991 +0000 UTC m=+2308.595189909" Mar 09 13:59:14 crc kubenswrapper[4764]: I0309 13:59:14.561024 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 13:59:14 crc kubenswrapper[4764]: E0309 13:59:14.562148 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:59:17 crc kubenswrapper[4764]: I0309 13:59:17.377695 4764 generic.go:334] "Generic (PLEG): container finished" podID="cbffc6a1-81df-479c-b40e-3f865c187a73" containerID="a872ead50e271b964852737e0eae6103b43599f8554d92a0cf781c21b93321d7" exitCode=0 Mar 09 13:59:17 crc kubenswrapper[4764]: I0309 13:59:17.377801 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" event={"ID":"cbffc6a1-81df-479c-b40e-3f865c187a73","Type":"ContainerDied","Data":"a872ead50e271b964852737e0eae6103b43599f8554d92a0cf781c21b93321d7"} Mar 09 13:59:18 crc kubenswrapper[4764]: I0309 13:59:18.857873 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" Mar 09 13:59:18 crc kubenswrapper[4764]: I0309 13:59:18.913598 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-ceph\") pod \"cbffc6a1-81df-479c-b40e-3f865c187a73\" (UID: \"cbffc6a1-81df-479c-b40e-3f865c187a73\") " Mar 09 13:59:18 crc kubenswrapper[4764]: I0309 13:59:18.913764 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-inventory\") pod \"cbffc6a1-81df-479c-b40e-3f865c187a73\" (UID: \"cbffc6a1-81df-479c-b40e-3f865c187a73\") " Mar 09 13:59:18 crc kubenswrapper[4764]: I0309 13:59:18.913924 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs48l\" (UniqueName: \"kubernetes.io/projected/cbffc6a1-81df-479c-b40e-3f865c187a73-kube-api-access-xs48l\") pod \"cbffc6a1-81df-479c-b40e-3f865c187a73\" (UID: \"cbffc6a1-81df-479c-b40e-3f865c187a73\") " Mar 09 13:59:18 crc kubenswrapper[4764]: I0309 13:59:18.914015 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-ssh-key-openstack-edpm-ipam\") pod \"cbffc6a1-81df-479c-b40e-3f865c187a73\" (UID: \"cbffc6a1-81df-479c-b40e-3f865c187a73\") " Mar 09 13:59:18 crc kubenswrapper[4764]: I0309 13:59:18.921751 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-ceph" (OuterVolumeSpecName: "ceph") pod "cbffc6a1-81df-479c-b40e-3f865c187a73" (UID: "cbffc6a1-81df-479c-b40e-3f865c187a73"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:59:18 crc kubenswrapper[4764]: I0309 13:59:18.923935 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbffc6a1-81df-479c-b40e-3f865c187a73-kube-api-access-xs48l" (OuterVolumeSpecName: "kube-api-access-xs48l") pod "cbffc6a1-81df-479c-b40e-3f865c187a73" (UID: "cbffc6a1-81df-479c-b40e-3f865c187a73"). InnerVolumeSpecName "kube-api-access-xs48l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:59:18 crc kubenswrapper[4764]: E0309 13:59:18.942823 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-ssh-key-openstack-edpm-ipam podName:cbffc6a1-81df-479c-b40e-3f865c187a73 nodeName:}" failed. No retries permitted until 2026-03-09 13:59:19.442779766 +0000 UTC m=+2314.692951674 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ssh-key-openstack-edpm-ipam" (UniqueName: "kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-ssh-key-openstack-edpm-ipam") pod "cbffc6a1-81df-479c-b40e-3f865c187a73" (UID: "cbffc6a1-81df-479c-b40e-3f865c187a73") : error deleting /var/lib/kubelet/pods/cbffc6a1-81df-479c-b40e-3f865c187a73/volume-subpaths: remove /var/lib/kubelet/pods/cbffc6a1-81df-479c-b40e-3f865c187a73/volume-subpaths: no such file or directory Mar 09 13:59:18 crc kubenswrapper[4764]: I0309 13:59:18.946268 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-inventory" (OuterVolumeSpecName: "inventory") pod "cbffc6a1-81df-479c-b40e-3f865c187a73" (UID: "cbffc6a1-81df-479c-b40e-3f865c187a73"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.017043 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.017087 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs48l\" (UniqueName: \"kubernetes.io/projected/cbffc6a1-81df-479c-b40e-3f865c187a73-kube-api-access-xs48l\") on node \"crc\" DevicePath \"\"" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.017099 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.399839 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" event={"ID":"cbffc6a1-81df-479c-b40e-3f865c187a73","Type":"ContainerDied","Data":"2851c1e5435a239fbe006f90989fb7fbe06dda075e0fa33347b764f78afacfae"} Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.400330 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2851c1e5435a239fbe006f90989fb7fbe06dda075e0fa33347b764f78afacfae" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.399923 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.528308 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-ssh-key-openstack-edpm-ipam\") pod \"cbffc6a1-81df-479c-b40e-3f865c187a73\" (UID: \"cbffc6a1-81df-479c-b40e-3f865c187a73\") " Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.528431 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl"] Mar 09 13:59:19 crc kubenswrapper[4764]: E0309 13:59:19.528988 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbffc6a1-81df-479c-b40e-3f865c187a73" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.529013 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbffc6a1-81df-479c-b40e-3f865c187a73" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.529225 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbffc6a1-81df-479c-b40e-3f865c187a73" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.530098 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.535565 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cbffc6a1-81df-479c-b40e-3f865c187a73" (UID: "cbffc6a1-81df-479c-b40e-3f865c187a73"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.548438 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl"] Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.631606 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl\" (UID: \"ea3a2b04-e009-4dcd-8eca-543cc084b329\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.631730 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl\" (UID: \"ea3a2b04-e009-4dcd-8eca-543cc084b329\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.631789 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl\" (UID: \"ea3a2b04-e009-4dcd-8eca-543cc084b329\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.632360 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t9pd\" (UniqueName: \"kubernetes.io/projected/ea3a2b04-e009-4dcd-8eca-543cc084b329-kube-api-access-4t9pd\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl\" (UID: 
\"ea3a2b04-e009-4dcd-8eca-543cc084b329\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.632834 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.735200 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl\" (UID: \"ea3a2b04-e009-4dcd-8eca-543cc084b329\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.735373 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t9pd\" (UniqueName: \"kubernetes.io/projected/ea3a2b04-e009-4dcd-8eca-543cc084b329-kube-api-access-4t9pd\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl\" (UID: \"ea3a2b04-e009-4dcd-8eca-543cc084b329\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.735448 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl\" (UID: \"ea3a2b04-e009-4dcd-8eca-543cc084b329\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.735515 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-ceph\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl\" (UID: \"ea3a2b04-e009-4dcd-8eca-543cc084b329\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.740018 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl\" (UID: \"ea3a2b04-e009-4dcd-8eca-543cc084b329\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.740213 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl\" (UID: \"ea3a2b04-e009-4dcd-8eca-543cc084b329\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.741458 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl\" (UID: \"ea3a2b04-e009-4dcd-8eca-543cc084b329\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.763228 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t9pd\" (UniqueName: \"kubernetes.io/projected/ea3a2b04-e009-4dcd-8eca-543cc084b329-kube-api-access-4t9pd\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl\" (UID: \"ea3a2b04-e009-4dcd-8eca-543cc084b329\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.896231 4764 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" Mar 09 13:59:20 crc kubenswrapper[4764]: I0309 13:59:20.509278 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl"] Mar 09 13:59:20 crc kubenswrapper[4764]: W0309 13:59:20.513208 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea3a2b04_e009_4dcd_8eca_543cc084b329.slice/crio-29a0e5d6218a8119d39ffa3dd74128ff15504a3840777dcd48ebe0d5cf642624 WatchSource:0}: Error finding container 29a0e5d6218a8119d39ffa3dd74128ff15504a3840777dcd48ebe0d5cf642624: Status 404 returned error can't find the container with id 29a0e5d6218a8119d39ffa3dd74128ff15504a3840777dcd48ebe0d5cf642624 Mar 09 13:59:21 crc kubenswrapper[4764]: I0309 13:59:21.425244 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" event={"ID":"ea3a2b04-e009-4dcd-8eca-543cc084b329","Type":"ContainerStarted","Data":"91d47ab481b9f29f3401d128d19aa21a508e0899ce7b78b79fba40dc480e7870"} Mar 09 13:59:21 crc kubenswrapper[4764]: I0309 13:59:21.425667 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" event={"ID":"ea3a2b04-e009-4dcd-8eca-543cc084b329","Type":"ContainerStarted","Data":"29a0e5d6218a8119d39ffa3dd74128ff15504a3840777dcd48ebe0d5cf642624"} Mar 09 13:59:21 crc kubenswrapper[4764]: I0309 13:59:21.462631 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" podStartSLOduration=2.026957245 podStartE2EDuration="2.462602359s" podCreationTimestamp="2026-03-09 13:59:19 +0000 UTC" firstStartedPulling="2026-03-09 13:59:20.517278476 +0000 UTC m=+2315.767450404" lastFinishedPulling="2026-03-09 13:59:20.95292361 
+0000 UTC m=+2316.203095518" observedRunningTime="2026-03-09 13:59:21.452082779 +0000 UTC m=+2316.702254697" watchObservedRunningTime="2026-03-09 13:59:21.462602359 +0000 UTC m=+2316.712774267" Mar 09 13:59:25 crc kubenswrapper[4764]: I0309 13:59:25.567986 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 13:59:25 crc kubenswrapper[4764]: E0309 13:59:25.569012 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:59:34 crc kubenswrapper[4764]: I0309 13:59:34.938869 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9bhcc"] Mar 09 13:59:34 crc kubenswrapper[4764]: I0309 13:59:34.942586 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:34 crc kubenswrapper[4764]: I0309 13:59:34.951212 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9bhcc"] Mar 09 13:59:34 crc kubenswrapper[4764]: I0309 13:59:34.975718 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bbjx\" (UniqueName: \"kubernetes.io/projected/9d992693-633d-4d51-9c8d-965e2ee308f6-kube-api-access-5bbjx\") pod \"redhat-marketplace-9bhcc\" (UID: \"9d992693-633d-4d51-9c8d-965e2ee308f6\") " pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:34 crc kubenswrapper[4764]: I0309 13:59:34.975829 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d992693-633d-4d51-9c8d-965e2ee308f6-catalog-content\") pod \"redhat-marketplace-9bhcc\" (UID: \"9d992693-633d-4d51-9c8d-965e2ee308f6\") " pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:34 crc kubenswrapper[4764]: I0309 13:59:34.975956 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d992693-633d-4d51-9c8d-965e2ee308f6-utilities\") pod \"redhat-marketplace-9bhcc\" (UID: \"9d992693-633d-4d51-9c8d-965e2ee308f6\") " pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:35 crc kubenswrapper[4764]: I0309 13:59:35.078382 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d992693-633d-4d51-9c8d-965e2ee308f6-catalog-content\") pod \"redhat-marketplace-9bhcc\" (UID: \"9d992693-633d-4d51-9c8d-965e2ee308f6\") " pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:35 crc kubenswrapper[4764]: I0309 13:59:35.078526 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d992693-633d-4d51-9c8d-965e2ee308f6-utilities\") pod \"redhat-marketplace-9bhcc\" (UID: \"9d992693-633d-4d51-9c8d-965e2ee308f6\") " pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:35 crc kubenswrapper[4764]: I0309 13:59:35.078621 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bbjx\" (UniqueName: \"kubernetes.io/projected/9d992693-633d-4d51-9c8d-965e2ee308f6-kube-api-access-5bbjx\") pod \"redhat-marketplace-9bhcc\" (UID: \"9d992693-633d-4d51-9c8d-965e2ee308f6\") " pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:35 crc kubenswrapper[4764]: I0309 13:59:35.078895 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d992693-633d-4d51-9c8d-965e2ee308f6-catalog-content\") pod \"redhat-marketplace-9bhcc\" (UID: \"9d992693-633d-4d51-9c8d-965e2ee308f6\") " pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:35 crc kubenswrapper[4764]: I0309 13:59:35.079212 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d992693-633d-4d51-9c8d-965e2ee308f6-utilities\") pod \"redhat-marketplace-9bhcc\" (UID: \"9d992693-633d-4d51-9c8d-965e2ee308f6\") " pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:35 crc kubenswrapper[4764]: I0309 13:59:35.108166 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bbjx\" (UniqueName: \"kubernetes.io/projected/9d992693-633d-4d51-9c8d-965e2ee308f6-kube-api-access-5bbjx\") pod \"redhat-marketplace-9bhcc\" (UID: \"9d992693-633d-4d51-9c8d-965e2ee308f6\") " pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:35 crc kubenswrapper[4764]: I0309 13:59:35.278240 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:35 crc kubenswrapper[4764]: I0309 13:59:35.626385 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9bhcc"] Mar 09 13:59:35 crc kubenswrapper[4764]: W0309 13:59:35.647407 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d992693_633d_4d51_9c8d_965e2ee308f6.slice/crio-9b55baf19f6fada22e37fe1cc527b71c3a02abd3b4bda4dc7937acfff5ac91bf WatchSource:0}: Error finding container 9b55baf19f6fada22e37fe1cc527b71c3a02abd3b4bda4dc7937acfff5ac91bf: Status 404 returned error can't find the container with id 9b55baf19f6fada22e37fe1cc527b71c3a02abd3b4bda4dc7937acfff5ac91bf Mar 09 13:59:36 crc kubenswrapper[4764]: I0309 13:59:36.561714 4764 generic.go:334] "Generic (PLEG): container finished" podID="9d992693-633d-4d51-9c8d-965e2ee308f6" containerID="10b85fab7d63359ecd39b07502ac92df194594bb649d4a8636d61a702e9c2ca9" exitCode=0 Mar 09 13:59:36 crc kubenswrapper[4764]: I0309 13:59:36.561791 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9bhcc" event={"ID":"9d992693-633d-4d51-9c8d-965e2ee308f6","Type":"ContainerDied","Data":"10b85fab7d63359ecd39b07502ac92df194594bb649d4a8636d61a702e9c2ca9"} Mar 09 13:59:36 crc kubenswrapper[4764]: I0309 13:59:36.561856 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9bhcc" event={"ID":"9d992693-633d-4d51-9c8d-965e2ee308f6","Type":"ContainerStarted","Data":"9b55baf19f6fada22e37fe1cc527b71c3a02abd3b4bda4dc7937acfff5ac91bf"} Mar 09 13:59:37 crc kubenswrapper[4764]: I0309 13:59:37.559638 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 13:59:37 crc kubenswrapper[4764]: E0309 13:59:37.560427 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:59:37 crc kubenswrapper[4764]: I0309 13:59:37.575055 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9bhcc" event={"ID":"9d992693-633d-4d51-9c8d-965e2ee308f6","Type":"ContainerStarted","Data":"5665e0e89795cb215fdc2084c3f5e88666f9a37297088c9e625faf6bee01a1c4"} Mar 09 13:59:38 crc kubenswrapper[4764]: I0309 13:59:38.613811 4764 generic.go:334] "Generic (PLEG): container finished" podID="9d992693-633d-4d51-9c8d-965e2ee308f6" containerID="5665e0e89795cb215fdc2084c3f5e88666f9a37297088c9e625faf6bee01a1c4" exitCode=0 Mar 09 13:59:38 crc kubenswrapper[4764]: I0309 13:59:38.613928 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9bhcc" event={"ID":"9d992693-633d-4d51-9c8d-965e2ee308f6","Type":"ContainerDied","Data":"5665e0e89795cb215fdc2084c3f5e88666f9a37297088c9e625faf6bee01a1c4"} Mar 09 13:59:39 crc kubenswrapper[4764]: I0309 13:59:39.628523 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9bhcc" event={"ID":"9d992693-633d-4d51-9c8d-965e2ee308f6","Type":"ContainerStarted","Data":"c2852137dee859a501c3f8a28af6eb95c9b607c41dee834f3e668b1c1c9858e0"} Mar 09 13:59:39 crc kubenswrapper[4764]: I0309 13:59:39.653178 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9bhcc" podStartSLOduration=3.119546915 podStartE2EDuration="5.653151056s" podCreationTimestamp="2026-03-09 13:59:34 +0000 UTC" firstStartedPulling="2026-03-09 13:59:36.564545397 +0000 UTC m=+2331.814717305" 
lastFinishedPulling="2026-03-09 13:59:39.098149538 +0000 UTC m=+2334.348321446" observedRunningTime="2026-03-09 13:59:39.651483382 +0000 UTC m=+2334.901655310" watchObservedRunningTime="2026-03-09 13:59:39.653151056 +0000 UTC m=+2334.903322984" Mar 09 13:59:45 crc kubenswrapper[4764]: I0309 13:59:45.279709 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:45 crc kubenswrapper[4764]: I0309 13:59:45.280793 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:45 crc kubenswrapper[4764]: I0309 13:59:45.339120 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:45 crc kubenswrapper[4764]: I0309 13:59:45.758090 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:45 crc kubenswrapper[4764]: I0309 13:59:45.821395 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9bhcc"] Mar 09 13:59:47 crc kubenswrapper[4764]: I0309 13:59:47.716204 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9bhcc" podUID="9d992693-633d-4d51-9c8d-965e2ee308f6" containerName="registry-server" containerID="cri-o://c2852137dee859a501c3f8a28af6eb95c9b607c41dee834f3e668b1c1c9858e0" gracePeriod=2 Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.560401 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 13:59:48 crc kubenswrapper[4764]: E0309 13:59:48.561219 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.706391 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.730589 4764 generic.go:334] "Generic (PLEG): container finished" podID="9d992693-633d-4d51-9c8d-965e2ee308f6" containerID="c2852137dee859a501c3f8a28af6eb95c9b607c41dee834f3e668b1c1c9858e0" exitCode=0 Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.730705 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9bhcc" event={"ID":"9d992693-633d-4d51-9c8d-965e2ee308f6","Type":"ContainerDied","Data":"c2852137dee859a501c3f8a28af6eb95c9b607c41dee834f3e668b1c1c9858e0"} Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.730816 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9bhcc" event={"ID":"9d992693-633d-4d51-9c8d-965e2ee308f6","Type":"ContainerDied","Data":"9b55baf19f6fada22e37fe1cc527b71c3a02abd3b4bda4dc7937acfff5ac91bf"} Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.730847 4764 scope.go:117] "RemoveContainer" containerID="c2852137dee859a501c3f8a28af6eb95c9b607c41dee834f3e668b1c1c9858e0" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.730737 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.771111 4764 scope.go:117] "RemoveContainer" containerID="5665e0e89795cb215fdc2084c3f5e88666f9a37297088c9e625faf6bee01a1c4" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.796334 4764 scope.go:117] "RemoveContainer" containerID="10b85fab7d63359ecd39b07502ac92df194594bb649d4a8636d61a702e9c2ca9" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.818000 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bbjx\" (UniqueName: \"kubernetes.io/projected/9d992693-633d-4d51-9c8d-965e2ee308f6-kube-api-access-5bbjx\") pod \"9d992693-633d-4d51-9c8d-965e2ee308f6\" (UID: \"9d992693-633d-4d51-9c8d-965e2ee308f6\") " Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.818075 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d992693-633d-4d51-9c8d-965e2ee308f6-utilities\") pod \"9d992693-633d-4d51-9c8d-965e2ee308f6\" (UID: \"9d992693-633d-4d51-9c8d-965e2ee308f6\") " Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.818371 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d992693-633d-4d51-9c8d-965e2ee308f6-catalog-content\") pod \"9d992693-633d-4d51-9c8d-965e2ee308f6\" (UID: \"9d992693-633d-4d51-9c8d-965e2ee308f6\") " Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.819528 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d992693-633d-4d51-9c8d-965e2ee308f6-utilities" (OuterVolumeSpecName: "utilities") pod "9d992693-633d-4d51-9c8d-965e2ee308f6" (UID: "9d992693-633d-4d51-9c8d-965e2ee308f6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.831960 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d992693-633d-4d51-9c8d-965e2ee308f6-kube-api-access-5bbjx" (OuterVolumeSpecName: "kube-api-access-5bbjx") pod "9d992693-633d-4d51-9c8d-965e2ee308f6" (UID: "9d992693-633d-4d51-9c8d-965e2ee308f6"). InnerVolumeSpecName "kube-api-access-5bbjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.849290 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d992693-633d-4d51-9c8d-965e2ee308f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d992693-633d-4d51-9c8d-965e2ee308f6" (UID: "9d992693-633d-4d51-9c8d-965e2ee308f6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.851046 4764 scope.go:117] "RemoveContainer" containerID="c2852137dee859a501c3f8a28af6eb95c9b607c41dee834f3e668b1c1c9858e0" Mar 09 13:59:48 crc kubenswrapper[4764]: E0309 13:59:48.852860 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2852137dee859a501c3f8a28af6eb95c9b607c41dee834f3e668b1c1c9858e0\": container with ID starting with c2852137dee859a501c3f8a28af6eb95c9b607c41dee834f3e668b1c1c9858e0 not found: ID does not exist" containerID="c2852137dee859a501c3f8a28af6eb95c9b607c41dee834f3e668b1c1c9858e0" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.852901 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2852137dee859a501c3f8a28af6eb95c9b607c41dee834f3e668b1c1c9858e0"} err="failed to get container status \"c2852137dee859a501c3f8a28af6eb95c9b607c41dee834f3e668b1c1c9858e0\": rpc error: code = NotFound desc = could not find 
container \"c2852137dee859a501c3f8a28af6eb95c9b607c41dee834f3e668b1c1c9858e0\": container with ID starting with c2852137dee859a501c3f8a28af6eb95c9b607c41dee834f3e668b1c1c9858e0 not found: ID does not exist" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.852929 4764 scope.go:117] "RemoveContainer" containerID="5665e0e89795cb215fdc2084c3f5e88666f9a37297088c9e625faf6bee01a1c4" Mar 09 13:59:48 crc kubenswrapper[4764]: E0309 13:59:48.853516 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5665e0e89795cb215fdc2084c3f5e88666f9a37297088c9e625faf6bee01a1c4\": container with ID starting with 5665e0e89795cb215fdc2084c3f5e88666f9a37297088c9e625faf6bee01a1c4 not found: ID does not exist" containerID="5665e0e89795cb215fdc2084c3f5e88666f9a37297088c9e625faf6bee01a1c4" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.853564 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5665e0e89795cb215fdc2084c3f5e88666f9a37297088c9e625faf6bee01a1c4"} err="failed to get container status \"5665e0e89795cb215fdc2084c3f5e88666f9a37297088c9e625faf6bee01a1c4\": rpc error: code = NotFound desc = could not find container \"5665e0e89795cb215fdc2084c3f5e88666f9a37297088c9e625faf6bee01a1c4\": container with ID starting with 5665e0e89795cb215fdc2084c3f5e88666f9a37297088c9e625faf6bee01a1c4 not found: ID does not exist" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.853598 4764 scope.go:117] "RemoveContainer" containerID="10b85fab7d63359ecd39b07502ac92df194594bb649d4a8636d61a702e9c2ca9" Mar 09 13:59:48 crc kubenswrapper[4764]: E0309 13:59:48.854554 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10b85fab7d63359ecd39b07502ac92df194594bb649d4a8636d61a702e9c2ca9\": container with ID starting with 10b85fab7d63359ecd39b07502ac92df194594bb649d4a8636d61a702e9c2ca9 not found: ID does 
not exist" containerID="10b85fab7d63359ecd39b07502ac92df194594bb649d4a8636d61a702e9c2ca9" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.854626 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10b85fab7d63359ecd39b07502ac92df194594bb649d4a8636d61a702e9c2ca9"} err="failed to get container status \"10b85fab7d63359ecd39b07502ac92df194594bb649d4a8636d61a702e9c2ca9\": rpc error: code = NotFound desc = could not find container \"10b85fab7d63359ecd39b07502ac92df194594bb649d4a8636d61a702e9c2ca9\": container with ID starting with 10b85fab7d63359ecd39b07502ac92df194594bb649d4a8636d61a702e9c2ca9 not found: ID does not exist" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.921730 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bbjx\" (UniqueName: \"kubernetes.io/projected/9d992693-633d-4d51-9c8d-965e2ee308f6-kube-api-access-5bbjx\") on node \"crc\" DevicePath \"\"" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.921772 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d992693-633d-4d51-9c8d-965e2ee308f6-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.921786 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d992693-633d-4d51-9c8d-965e2ee308f6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:59:49 crc kubenswrapper[4764]: I0309 13:59:49.075754 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9bhcc"] Mar 09 13:59:49 crc kubenswrapper[4764]: I0309 13:59:49.088169 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9bhcc"] Mar 09 13:59:49 crc kubenswrapper[4764]: E0309 13:59:49.224192 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" 
err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d992693_633d_4d51_9c8d_965e2ee308f6.slice/crio-9b55baf19f6fada22e37fe1cc527b71c3a02abd3b4bda4dc7937acfff5ac91bf\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d992693_633d_4d51_9c8d_965e2ee308f6.slice\": RecentStats: unable to find data in memory cache]" Mar 09 13:59:49 crc kubenswrapper[4764]: I0309 13:59:49.574093 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d992693-633d-4d51-9c8d-965e2ee308f6" path="/var/lib/kubelet/pods/9d992693-633d-4d51-9c8d-965e2ee308f6/volumes" Mar 09 13:59:57 crc kubenswrapper[4764]: I0309 13:59:57.837918 4764 generic.go:334] "Generic (PLEG): container finished" podID="ea3a2b04-e009-4dcd-8eca-543cc084b329" containerID="91d47ab481b9f29f3401d128d19aa21a508e0899ce7b78b79fba40dc480e7870" exitCode=0 Mar 09 13:59:57 crc kubenswrapper[4764]: I0309 13:59:57.838550 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" event={"ID":"ea3a2b04-e009-4dcd-8eca-543cc084b329","Type":"ContainerDied","Data":"91d47ab481b9f29f3401d128d19aa21a508e0899ce7b78b79fba40dc480e7870"} Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.280794 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.352019 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-ceph\") pod \"ea3a2b04-e009-4dcd-8eca-543cc084b329\" (UID: \"ea3a2b04-e009-4dcd-8eca-543cc084b329\") " Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.352313 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-inventory\") pod \"ea3a2b04-e009-4dcd-8eca-543cc084b329\" (UID: \"ea3a2b04-e009-4dcd-8eca-543cc084b329\") " Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.352373 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t9pd\" (UniqueName: \"kubernetes.io/projected/ea3a2b04-e009-4dcd-8eca-543cc084b329-kube-api-access-4t9pd\") pod \"ea3a2b04-e009-4dcd-8eca-543cc084b329\" (UID: \"ea3a2b04-e009-4dcd-8eca-543cc084b329\") " Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.352403 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-ssh-key-openstack-edpm-ipam\") pod \"ea3a2b04-e009-4dcd-8eca-543cc084b329\" (UID: \"ea3a2b04-e009-4dcd-8eca-543cc084b329\") " Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.360500 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-ceph" (OuterVolumeSpecName: "ceph") pod "ea3a2b04-e009-4dcd-8eca-543cc084b329" (UID: "ea3a2b04-e009-4dcd-8eca-543cc084b329"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.360937 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea3a2b04-e009-4dcd-8eca-543cc084b329-kube-api-access-4t9pd" (OuterVolumeSpecName: "kube-api-access-4t9pd") pod "ea3a2b04-e009-4dcd-8eca-543cc084b329" (UID: "ea3a2b04-e009-4dcd-8eca-543cc084b329"). InnerVolumeSpecName "kube-api-access-4t9pd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.385737 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ea3a2b04-e009-4dcd-8eca-543cc084b329" (UID: "ea3a2b04-e009-4dcd-8eca-543cc084b329"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.400961 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-inventory" (OuterVolumeSpecName: "inventory") pod "ea3a2b04-e009-4dcd-8eca-543cc084b329" (UID: "ea3a2b04-e009-4dcd-8eca-543cc084b329"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.457747 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.458398 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t9pd\" (UniqueName: \"kubernetes.io/projected/ea3a2b04-e009-4dcd-8eca-543cc084b329-kube-api-access-4t9pd\") on node \"crc\" DevicePath \"\"" Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.458416 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.458430 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.869631 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" event={"ID":"ea3a2b04-e009-4dcd-8eca-543cc084b329","Type":"ContainerDied","Data":"29a0e5d6218a8119d39ffa3dd74128ff15504a3840777dcd48ebe0d5cf642624"} Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.869703 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29a0e5d6218a8119d39ffa3dd74128ff15504a3840777dcd48ebe0d5cf642624" Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.869805 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.984018 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-x85q2"] Mar 09 13:59:59 crc kubenswrapper[4764]: E0309 13:59:59.984580 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d992693-633d-4d51-9c8d-965e2ee308f6" containerName="extract-content" Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.984600 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d992693-633d-4d51-9c8d-965e2ee308f6" containerName="extract-content" Mar 09 13:59:59 crc kubenswrapper[4764]: E0309 13:59:59.984616 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d992693-633d-4d51-9c8d-965e2ee308f6" containerName="extract-utilities" Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.984624 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d992693-633d-4d51-9c8d-965e2ee308f6" containerName="extract-utilities" Mar 09 13:59:59 crc kubenswrapper[4764]: E0309 13:59:59.984659 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea3a2b04-e009-4dcd-8eca-543cc084b329" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.984667 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3a2b04-e009-4dcd-8eca-543cc084b329" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:59:59 crc kubenswrapper[4764]: E0309 13:59:59.984693 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d992693-633d-4d51-9c8d-965e2ee308f6" containerName="registry-server" Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.984700 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d992693-633d-4d51-9c8d-965e2ee308f6" containerName="registry-server" Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.984895 4764 
memory_manager.go:354] "RemoveStaleState removing state" podUID="9d992693-633d-4d51-9c8d-965e2ee308f6" containerName="registry-server" Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.984915 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea3a2b04-e009-4dcd-8eca-543cc084b329" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.985815 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-x85q2" Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.992391 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.992835 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.992863 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.992908 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.995048 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.998254 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-x85q2"] Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.076402 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2q9z\" (UniqueName: \"kubernetes.io/projected/23319545-4107-4a83-b7e1-955e4648bf7b-kube-api-access-q2q9z\") pod \"ssh-known-hosts-edpm-deployment-x85q2\" (UID: \"23319545-4107-4a83-b7e1-955e4648bf7b\") 
" pod="openstack/ssh-known-hosts-edpm-deployment-x85q2" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.076471 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-x85q2\" (UID: \"23319545-4107-4a83-b7e1-955e4648bf7b\") " pod="openstack/ssh-known-hosts-edpm-deployment-x85q2" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.076537 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-ceph\") pod \"ssh-known-hosts-edpm-deployment-x85q2\" (UID: \"23319545-4107-4a83-b7e1-955e4648bf7b\") " pod="openstack/ssh-known-hosts-edpm-deployment-x85q2" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.076606 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-x85q2\" (UID: \"23319545-4107-4a83-b7e1-955e4648bf7b\") " pod="openstack/ssh-known-hosts-edpm-deployment-x85q2" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.139894 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551080-svz2w"] Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.141583 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551080-svz2w" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.144230 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.144263 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.144312 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.151111 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551080-svz2w"] Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.179170 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2q9z\" (UniqueName: \"kubernetes.io/projected/23319545-4107-4a83-b7e1-955e4648bf7b-kube-api-access-q2q9z\") pod \"ssh-known-hosts-edpm-deployment-x85q2\" (UID: \"23319545-4107-4a83-b7e1-955e4648bf7b\") " pod="openstack/ssh-known-hosts-edpm-deployment-x85q2" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.179246 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-x85q2\" (UID: \"23319545-4107-4a83-b7e1-955e4648bf7b\") " pod="openstack/ssh-known-hosts-edpm-deployment-x85q2" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.179290 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-ceph\") pod \"ssh-known-hosts-edpm-deployment-x85q2\" (UID: \"23319545-4107-4a83-b7e1-955e4648bf7b\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-x85q2" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.179328 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-x85q2\" (UID: \"23319545-4107-4a83-b7e1-955e4648bf7b\") " pod="openstack/ssh-known-hosts-edpm-deployment-x85q2" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.185757 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-ceph\") pod \"ssh-known-hosts-edpm-deployment-x85q2\" (UID: \"23319545-4107-4a83-b7e1-955e4648bf7b\") " pod="openstack/ssh-known-hosts-edpm-deployment-x85q2" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.186063 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-x85q2\" (UID: \"23319545-4107-4a83-b7e1-955e4648bf7b\") " pod="openstack/ssh-known-hosts-edpm-deployment-x85q2" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.186676 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-x85q2\" (UID: \"23319545-4107-4a83-b7e1-955e4648bf7b\") " pod="openstack/ssh-known-hosts-edpm-deployment-x85q2" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.196564 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2q9z\" (UniqueName: \"kubernetes.io/projected/23319545-4107-4a83-b7e1-955e4648bf7b-kube-api-access-q2q9z\") pod \"ssh-known-hosts-edpm-deployment-x85q2\" (UID: \"23319545-4107-4a83-b7e1-955e4648bf7b\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-x85q2" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.245136 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g"] Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.247929 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.253234 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.253567 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.257977 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g"] Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.281076 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8vx5\" (UniqueName: \"kubernetes.io/projected/f6eebc0e-7e89-4489-b808-7eebf0e54dca-kube-api-access-d8vx5\") pod \"auto-csr-approver-29551080-svz2w\" (UID: \"f6eebc0e-7e89-4489-b808-7eebf0e54dca\") " pod="openshift-infra/auto-csr-approver-29551080-svz2w" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.309695 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-x85q2" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.383597 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f13122f-94d3-47ba-9c7c-989ebe96468e-secret-volume\") pod \"collect-profiles-29551080-n964g\" (UID: \"4f13122f-94d3-47ba-9c7c-989ebe96468e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.383704 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f13122f-94d3-47ba-9c7c-989ebe96468e-config-volume\") pod \"collect-profiles-29551080-n964g\" (UID: \"4f13122f-94d3-47ba-9c7c-989ebe96468e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.384174 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5dvz\" (UniqueName: \"kubernetes.io/projected/4f13122f-94d3-47ba-9c7c-989ebe96468e-kube-api-access-d5dvz\") pod \"collect-profiles-29551080-n964g\" (UID: \"4f13122f-94d3-47ba-9c7c-989ebe96468e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.384563 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8vx5\" (UniqueName: \"kubernetes.io/projected/f6eebc0e-7e89-4489-b808-7eebf0e54dca-kube-api-access-d8vx5\") pod \"auto-csr-approver-29551080-svz2w\" (UID: \"f6eebc0e-7e89-4489-b808-7eebf0e54dca\") " pod="openshift-infra/auto-csr-approver-29551080-svz2w" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.412858 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8vx5\" 
(UniqueName: \"kubernetes.io/projected/f6eebc0e-7e89-4489-b808-7eebf0e54dca-kube-api-access-d8vx5\") pod \"auto-csr-approver-29551080-svz2w\" (UID: \"f6eebc0e-7e89-4489-b808-7eebf0e54dca\") " pod="openshift-infra/auto-csr-approver-29551080-svz2w" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.460207 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551080-svz2w" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.486356 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f13122f-94d3-47ba-9c7c-989ebe96468e-secret-volume\") pod \"collect-profiles-29551080-n964g\" (UID: \"4f13122f-94d3-47ba-9c7c-989ebe96468e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.486463 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f13122f-94d3-47ba-9c7c-989ebe96468e-config-volume\") pod \"collect-profiles-29551080-n964g\" (UID: \"4f13122f-94d3-47ba-9c7c-989ebe96468e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.486557 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5dvz\" (UniqueName: \"kubernetes.io/projected/4f13122f-94d3-47ba-9c7c-989ebe96468e-kube-api-access-d5dvz\") pod \"collect-profiles-29551080-n964g\" (UID: \"4f13122f-94d3-47ba-9c7c-989ebe96468e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.487814 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f13122f-94d3-47ba-9c7c-989ebe96468e-config-volume\") pod 
\"collect-profiles-29551080-n964g\" (UID: \"4f13122f-94d3-47ba-9c7c-989ebe96468e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.490387 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f13122f-94d3-47ba-9c7c-989ebe96468e-secret-volume\") pod \"collect-profiles-29551080-n964g\" (UID: \"4f13122f-94d3-47ba-9c7c-989ebe96468e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.511806 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5dvz\" (UniqueName: \"kubernetes.io/projected/4f13122f-94d3-47ba-9c7c-989ebe96468e-kube-api-access-d5dvz\") pod \"collect-profiles-29551080-n964g\" (UID: \"4f13122f-94d3-47ba-9c7c-989ebe96468e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.596780 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g" Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.888202 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-x85q2"] Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.903606 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551080-svz2w"] Mar 09 14:00:00 crc kubenswrapper[4764]: W0309 14:00:00.905458 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6eebc0e_7e89_4489_b808_7eebf0e54dca.slice/crio-b89a492d7501ec70f38c24543ed56396567de1540f48bf44ac303cff316d4171 WatchSource:0}: Error finding container b89a492d7501ec70f38c24543ed56396567de1540f48bf44ac303cff316d4171: Status 404 returned error can't find the container with id b89a492d7501ec70f38c24543ed56396567de1540f48bf44ac303cff316d4171 Mar 09 14:00:01 crc kubenswrapper[4764]: I0309 14:00:01.039782 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g"] Mar 09 14:00:01 crc kubenswrapper[4764]: W0309 14:00:01.043198 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f13122f_94d3_47ba_9c7c_989ebe96468e.slice/crio-243e774d47f8d9a657e2bda283d0e4b662d6968f445495e7b80e3a737580d3e8 WatchSource:0}: Error finding container 243e774d47f8d9a657e2bda283d0e4b662d6968f445495e7b80e3a737580d3e8: Status 404 returned error can't find the container with id 243e774d47f8d9a657e2bda283d0e4b662d6968f445495e7b80e3a737580d3e8 Mar 09 14:00:01 crc kubenswrapper[4764]: I0309 14:00:01.560730 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:00:01 crc kubenswrapper[4764]: E0309 14:00:01.561504 4764 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:00:01 crc kubenswrapper[4764]: I0309 14:00:01.902724 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-x85q2" event={"ID":"23319545-4107-4a83-b7e1-955e4648bf7b","Type":"ContainerStarted","Data":"8012376628a2434c25ec25a001db6c303665a134e1d1b05ef3e255939acfe13c"} Mar 09 14:00:01 crc kubenswrapper[4764]: I0309 14:00:01.903143 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-x85q2" event={"ID":"23319545-4107-4a83-b7e1-955e4648bf7b","Type":"ContainerStarted","Data":"892d7bf1897aed2e0545cb05dd3f303e9e4c066119aefab5bda9572a1af4261c"} Mar 09 14:00:01 crc kubenswrapper[4764]: I0309 14:00:01.908474 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551080-svz2w" event={"ID":"f6eebc0e-7e89-4489-b808-7eebf0e54dca","Type":"ContainerStarted","Data":"b89a492d7501ec70f38c24543ed56396567de1540f48bf44ac303cff316d4171"} Mar 09 14:00:01 crc kubenswrapper[4764]: I0309 14:00:01.912764 4764 generic.go:334] "Generic (PLEG): container finished" podID="4f13122f-94d3-47ba-9c7c-989ebe96468e" containerID="619f9f75884e6a89c52f4fcf55d2dfba6aa2fb01bdb5db49c229617e54ab7608" exitCode=0 Mar 09 14:00:01 crc kubenswrapper[4764]: I0309 14:00:01.912879 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g" event={"ID":"4f13122f-94d3-47ba-9c7c-989ebe96468e","Type":"ContainerDied","Data":"619f9f75884e6a89c52f4fcf55d2dfba6aa2fb01bdb5db49c229617e54ab7608"} Mar 09 14:00:01 crc 
kubenswrapper[4764]: I0309 14:00:01.912994 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g" event={"ID":"4f13122f-94d3-47ba-9c7c-989ebe96468e","Type":"ContainerStarted","Data":"243e774d47f8d9a657e2bda283d0e4b662d6968f445495e7b80e3a737580d3e8"} Mar 09 14:00:01 crc kubenswrapper[4764]: I0309 14:00:01.927379 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-x85q2" podStartSLOduration=2.359672535 podStartE2EDuration="2.927351282s" podCreationTimestamp="2026-03-09 13:59:59 +0000 UTC" firstStartedPulling="2026-03-09 14:00:00.893425006 +0000 UTC m=+2356.143596904" lastFinishedPulling="2026-03-09 14:00:01.461103743 +0000 UTC m=+2356.711275651" observedRunningTime="2026-03-09 14:00:01.925053351 +0000 UTC m=+2357.175225259" watchObservedRunningTime="2026-03-09 14:00:01.927351282 +0000 UTC m=+2357.177523190" Mar 09 14:00:03 crc kubenswrapper[4764]: I0309 14:00:03.265061 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g" Mar 09 14:00:03 crc kubenswrapper[4764]: I0309 14:00:03.351755 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f13122f-94d3-47ba-9c7c-989ebe96468e-secret-volume\") pod \"4f13122f-94d3-47ba-9c7c-989ebe96468e\" (UID: \"4f13122f-94d3-47ba-9c7c-989ebe96468e\") " Mar 09 14:00:03 crc kubenswrapper[4764]: I0309 14:00:03.351852 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5dvz\" (UniqueName: \"kubernetes.io/projected/4f13122f-94d3-47ba-9c7c-989ebe96468e-kube-api-access-d5dvz\") pod \"4f13122f-94d3-47ba-9c7c-989ebe96468e\" (UID: \"4f13122f-94d3-47ba-9c7c-989ebe96468e\") " Mar 09 14:00:03 crc kubenswrapper[4764]: I0309 14:00:03.351923 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f13122f-94d3-47ba-9c7c-989ebe96468e-config-volume\") pod \"4f13122f-94d3-47ba-9c7c-989ebe96468e\" (UID: \"4f13122f-94d3-47ba-9c7c-989ebe96468e\") " Mar 09 14:00:03 crc kubenswrapper[4764]: I0309 14:00:03.353251 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f13122f-94d3-47ba-9c7c-989ebe96468e-config-volume" (OuterVolumeSpecName: "config-volume") pod "4f13122f-94d3-47ba-9c7c-989ebe96468e" (UID: "4f13122f-94d3-47ba-9c7c-989ebe96468e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:00:03 crc kubenswrapper[4764]: I0309 14:00:03.360060 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f13122f-94d3-47ba-9c7c-989ebe96468e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4f13122f-94d3-47ba-9c7c-989ebe96468e" (UID: "4f13122f-94d3-47ba-9c7c-989ebe96468e"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:00:03 crc kubenswrapper[4764]: I0309 14:00:03.361520 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f13122f-94d3-47ba-9c7c-989ebe96468e-kube-api-access-d5dvz" (OuterVolumeSpecName: "kube-api-access-d5dvz") pod "4f13122f-94d3-47ba-9c7c-989ebe96468e" (UID: "4f13122f-94d3-47ba-9c7c-989ebe96468e"). InnerVolumeSpecName "kube-api-access-d5dvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:00:03 crc kubenswrapper[4764]: I0309 14:00:03.454906 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f13122f-94d3-47ba-9c7c-989ebe96468e-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 14:00:03 crc kubenswrapper[4764]: I0309 14:00:03.454961 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5dvz\" (UniqueName: \"kubernetes.io/projected/4f13122f-94d3-47ba-9c7c-989ebe96468e-kube-api-access-d5dvz\") on node \"crc\" DevicePath \"\"" Mar 09 14:00:03 crc kubenswrapper[4764]: I0309 14:00:03.454974 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f13122f-94d3-47ba-9c7c-989ebe96468e-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 14:00:03 crc kubenswrapper[4764]: I0309 14:00:03.933836 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g" event={"ID":"4f13122f-94d3-47ba-9c7c-989ebe96468e","Type":"ContainerDied","Data":"243e774d47f8d9a657e2bda283d0e4b662d6968f445495e7b80e3a737580d3e8"} Mar 09 14:00:03 crc kubenswrapper[4764]: I0309 14:00:03.934295 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="243e774d47f8d9a657e2bda283d0e4b662d6968f445495e7b80e3a737580d3e8" Mar 09 14:00:03 crc kubenswrapper[4764]: I0309 14:00:03.933893 4764 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g" Mar 09 14:00:04 crc kubenswrapper[4764]: I0309 14:00:04.360991 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn"] Mar 09 14:00:04 crc kubenswrapper[4764]: I0309 14:00:04.376887 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn"] Mar 09 14:00:04 crc kubenswrapper[4764]: I0309 14:00:04.945484 4764 generic.go:334] "Generic (PLEG): container finished" podID="f6eebc0e-7e89-4489-b808-7eebf0e54dca" containerID="55591ff62c6b7e1f8299cbaa3c157dfe4ead0808f086d1b63edc0dcadd27bf4a" exitCode=0 Mar 09 14:00:04 crc kubenswrapper[4764]: I0309 14:00:04.945554 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551080-svz2w" event={"ID":"f6eebc0e-7e89-4489-b808-7eebf0e54dca","Type":"ContainerDied","Data":"55591ff62c6b7e1f8299cbaa3c157dfe4ead0808f086d1b63edc0dcadd27bf4a"} Mar 09 14:00:05 crc kubenswrapper[4764]: I0309 14:00:05.574100 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39da5087-79bc-4154-b340-22183d9e4417" path="/var/lib/kubelet/pods/39da5087-79bc-4154-b340-22183d9e4417/volumes" Mar 09 14:00:06 crc kubenswrapper[4764]: I0309 14:00:06.299061 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551080-svz2w" Mar 09 14:00:06 crc kubenswrapper[4764]: I0309 14:00:06.450870 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8vx5\" (UniqueName: \"kubernetes.io/projected/f6eebc0e-7e89-4489-b808-7eebf0e54dca-kube-api-access-d8vx5\") pod \"f6eebc0e-7e89-4489-b808-7eebf0e54dca\" (UID: \"f6eebc0e-7e89-4489-b808-7eebf0e54dca\") " Mar 09 14:00:06 crc kubenswrapper[4764]: I0309 14:00:06.458370 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6eebc0e-7e89-4489-b808-7eebf0e54dca-kube-api-access-d8vx5" (OuterVolumeSpecName: "kube-api-access-d8vx5") pod "f6eebc0e-7e89-4489-b808-7eebf0e54dca" (UID: "f6eebc0e-7e89-4489-b808-7eebf0e54dca"). InnerVolumeSpecName "kube-api-access-d8vx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:00:06 crc kubenswrapper[4764]: I0309 14:00:06.553680 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8vx5\" (UniqueName: \"kubernetes.io/projected/f6eebc0e-7e89-4489-b808-7eebf0e54dca-kube-api-access-d8vx5\") on node \"crc\" DevicePath \"\"" Mar 09 14:00:06 crc kubenswrapper[4764]: I0309 14:00:06.973707 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551080-svz2w" event={"ID":"f6eebc0e-7e89-4489-b808-7eebf0e54dca","Type":"ContainerDied","Data":"b89a492d7501ec70f38c24543ed56396567de1540f48bf44ac303cff316d4171"} Mar 09 14:00:06 crc kubenswrapper[4764]: I0309 14:00:06.973765 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b89a492d7501ec70f38c24543ed56396567de1540f48bf44ac303cff316d4171" Mar 09 14:00:06 crc kubenswrapper[4764]: I0309 14:00:06.973765 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551080-svz2w" Mar 09 14:00:07 crc kubenswrapper[4764]: I0309 14:00:07.371853 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551074-ztbcz"] Mar 09 14:00:07 crc kubenswrapper[4764]: I0309 14:00:07.407628 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551074-ztbcz"] Mar 09 14:00:07 crc kubenswrapper[4764]: I0309 14:00:07.571858 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="909c58d5-d4d7-4042-94f0-df77bda9590a" path="/var/lib/kubelet/pods/909c58d5-d4d7-4042-94f0-df77bda9590a/volumes" Mar 09 14:00:09 crc kubenswrapper[4764]: I0309 14:00:09.051172 4764 scope.go:117] "RemoveContainer" containerID="45383c97959a17ffdeaa9f0ab6e8a1110b113c90d84de0fa490663169c04fa26" Mar 09 14:00:09 crc kubenswrapper[4764]: I0309 14:00:09.083080 4764 scope.go:117] "RemoveContainer" containerID="5510a27aa618536b31f12ced254c914aa21f71e4d8962e547b119bb29d1548f1" Mar 09 14:00:10 crc kubenswrapper[4764]: I0309 14:00:10.004852 4764 generic.go:334] "Generic (PLEG): container finished" podID="23319545-4107-4a83-b7e1-955e4648bf7b" containerID="8012376628a2434c25ec25a001db6c303665a134e1d1b05ef3e255939acfe13c" exitCode=0 Mar 09 14:00:10 crc kubenswrapper[4764]: I0309 14:00:10.004923 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-x85q2" event={"ID":"23319545-4107-4a83-b7e1-955e4648bf7b","Type":"ContainerDied","Data":"8012376628a2434c25ec25a001db6c303665a134e1d1b05ef3e255939acfe13c"} Mar 09 14:00:11 crc kubenswrapper[4764]: I0309 14:00:11.451312 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-x85q2" Mar 09 14:00:11 crc kubenswrapper[4764]: I0309 14:00:11.560222 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-ceph\") pod \"23319545-4107-4a83-b7e1-955e4648bf7b\" (UID: \"23319545-4107-4a83-b7e1-955e4648bf7b\") " Mar 09 14:00:11 crc kubenswrapper[4764]: I0309 14:00:11.560806 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-inventory-0\") pod \"23319545-4107-4a83-b7e1-955e4648bf7b\" (UID: \"23319545-4107-4a83-b7e1-955e4648bf7b\") " Mar 09 14:00:11 crc kubenswrapper[4764]: I0309 14:00:11.560882 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2q9z\" (UniqueName: \"kubernetes.io/projected/23319545-4107-4a83-b7e1-955e4648bf7b-kube-api-access-q2q9z\") pod \"23319545-4107-4a83-b7e1-955e4648bf7b\" (UID: \"23319545-4107-4a83-b7e1-955e4648bf7b\") " Mar 09 14:00:11 crc kubenswrapper[4764]: I0309 14:00:11.560925 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-ssh-key-openstack-edpm-ipam\") pod \"23319545-4107-4a83-b7e1-955e4648bf7b\" (UID: \"23319545-4107-4a83-b7e1-955e4648bf7b\") " Mar 09 14:00:11 crc kubenswrapper[4764]: I0309 14:00:11.566832 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-ceph" (OuterVolumeSpecName: "ceph") pod "23319545-4107-4a83-b7e1-955e4648bf7b" (UID: "23319545-4107-4a83-b7e1-955e4648bf7b"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:00:11 crc kubenswrapper[4764]: I0309 14:00:11.567496 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23319545-4107-4a83-b7e1-955e4648bf7b-kube-api-access-q2q9z" (OuterVolumeSpecName: "kube-api-access-q2q9z") pod "23319545-4107-4a83-b7e1-955e4648bf7b" (UID: "23319545-4107-4a83-b7e1-955e4648bf7b"). InnerVolumeSpecName "kube-api-access-q2q9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:00:11 crc kubenswrapper[4764]: I0309 14:00:11.588268 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "23319545-4107-4a83-b7e1-955e4648bf7b" (UID: "23319545-4107-4a83-b7e1-955e4648bf7b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:00:11 crc kubenswrapper[4764]: I0309 14:00:11.593704 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "23319545-4107-4a83-b7e1-955e4648bf7b" (UID: "23319545-4107-4a83-b7e1-955e4648bf7b"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:00:11 crc kubenswrapper[4764]: I0309 14:00:11.665838 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 14:00:11 crc kubenswrapper[4764]: I0309 14:00:11.665879 4764 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:00:11 crc kubenswrapper[4764]: I0309 14:00:11.665893 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2q9z\" (UniqueName: \"kubernetes.io/projected/23319545-4107-4a83-b7e1-955e4648bf7b-kube-api-access-q2q9z\") on node \"crc\" DevicePath \"\"" Mar 09 14:00:11 crc kubenswrapper[4764]: I0309 14:00:11.665907 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.028321 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-x85q2" event={"ID":"23319545-4107-4a83-b7e1-955e4648bf7b","Type":"ContainerDied","Data":"892d7bf1897aed2e0545cb05dd3f303e9e4c066119aefab5bda9572a1af4261c"} Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.028380 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="892d7bf1897aed2e0545cb05dd3f303e9e4c066119aefab5bda9572a1af4261c" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.028473 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-x85q2" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.113088 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6"] Mar 09 14:00:12 crc kubenswrapper[4764]: E0309 14:00:12.113707 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6eebc0e-7e89-4489-b808-7eebf0e54dca" containerName="oc" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.113735 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6eebc0e-7e89-4489-b808-7eebf0e54dca" containerName="oc" Mar 09 14:00:12 crc kubenswrapper[4764]: E0309 14:00:12.113789 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23319545-4107-4a83-b7e1-955e4648bf7b" containerName="ssh-known-hosts-edpm-deployment" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.113802 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="23319545-4107-4a83-b7e1-955e4648bf7b" containerName="ssh-known-hosts-edpm-deployment" Mar 09 14:00:12 crc kubenswrapper[4764]: E0309 14:00:12.113818 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f13122f-94d3-47ba-9c7c-989ebe96468e" containerName="collect-profiles" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.113828 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f13122f-94d3-47ba-9c7c-989ebe96468e" containerName="collect-profiles" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.114083 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f13122f-94d3-47ba-9c7c-989ebe96468e" containerName="collect-profiles" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.114119 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="23319545-4107-4a83-b7e1-955e4648bf7b" containerName="ssh-known-hosts-edpm-deployment" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.114154 4764 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="f6eebc0e-7e89-4489-b808-7eebf0e54dca" containerName="oc" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.115364 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.117716 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.118265 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.118436 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.119766 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.119813 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.126131 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6"] Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.282266 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c4rw6\" (UID: \"942b7017-cdda-4d7a-8be8-521111f4fcd1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.282574 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c4rw6\" (UID: \"942b7017-cdda-4d7a-8be8-521111f4fcd1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.282812 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zdw4\" (UniqueName: \"kubernetes.io/projected/942b7017-cdda-4d7a-8be8-521111f4fcd1-kube-api-access-5zdw4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c4rw6\" (UID: \"942b7017-cdda-4d7a-8be8-521111f4fcd1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.283198 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c4rw6\" (UID: \"942b7017-cdda-4d7a-8be8-521111f4fcd1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.385668 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c4rw6\" (UID: \"942b7017-cdda-4d7a-8be8-521111f4fcd1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.385820 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c4rw6\" (UID: \"942b7017-cdda-4d7a-8be8-521111f4fcd1\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.385864 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zdw4\" (UniqueName: \"kubernetes.io/projected/942b7017-cdda-4d7a-8be8-521111f4fcd1-kube-api-access-5zdw4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c4rw6\" (UID: \"942b7017-cdda-4d7a-8be8-521111f4fcd1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.385952 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c4rw6\" (UID: \"942b7017-cdda-4d7a-8be8-521111f4fcd1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.391330 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c4rw6\" (UID: \"942b7017-cdda-4d7a-8be8-521111f4fcd1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.394604 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c4rw6\" (UID: \"942b7017-cdda-4d7a-8be8-521111f4fcd1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.403319 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zdw4\" (UniqueName: 
\"kubernetes.io/projected/942b7017-cdda-4d7a-8be8-521111f4fcd1-kube-api-access-5zdw4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c4rw6\" (UID: \"942b7017-cdda-4d7a-8be8-521111f4fcd1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.406246 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c4rw6\" (UID: \"942b7017-cdda-4d7a-8be8-521111f4fcd1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.434248 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.966774 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6"] Mar 09 14:00:13 crc kubenswrapper[4764]: I0309 14:00:13.040716 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" event={"ID":"942b7017-cdda-4d7a-8be8-521111f4fcd1","Type":"ContainerStarted","Data":"d92809745759714030457fe7c8af618ee88aa24996b77390082f9f223a4cefad"} Mar 09 14:00:14 crc kubenswrapper[4764]: I0309 14:00:14.560188 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:00:14 crc kubenswrapper[4764]: E0309 14:00:14.561159 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:00:15 crc kubenswrapper[4764]: I0309 14:00:15.057610 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" event={"ID":"942b7017-cdda-4d7a-8be8-521111f4fcd1","Type":"ContainerStarted","Data":"a06d71f1a77dceca0be2bf0b483e0d47e9a68e3152d9acd920b824104f7f40be"} Mar 09 14:00:15 crc kubenswrapper[4764]: I0309 14:00:15.081569 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" podStartSLOduration=2.190152158 podStartE2EDuration="3.08153967s" podCreationTimestamp="2026-03-09 14:00:12 +0000 UTC" firstStartedPulling="2026-03-09 14:00:12.97155852 +0000 UTC m=+2368.221730428" lastFinishedPulling="2026-03-09 14:00:13.862946032 +0000 UTC m=+2369.113117940" observedRunningTime="2026-03-09 14:00:15.07330291 +0000 UTC m=+2370.323474838" watchObservedRunningTime="2026-03-09 14:00:15.08153967 +0000 UTC m=+2370.331711598" Mar 09 14:00:21 crc kubenswrapper[4764]: I0309 14:00:21.118479 4764 generic.go:334] "Generic (PLEG): container finished" podID="942b7017-cdda-4d7a-8be8-521111f4fcd1" containerID="a06d71f1a77dceca0be2bf0b483e0d47e9a68e3152d9acd920b824104f7f40be" exitCode=0 Mar 09 14:00:21 crc kubenswrapper[4764]: I0309 14:00:21.118580 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" event={"ID":"942b7017-cdda-4d7a-8be8-521111f4fcd1","Type":"ContainerDied","Data":"a06d71f1a77dceca0be2bf0b483e0d47e9a68e3152d9acd920b824104f7f40be"} Mar 09 14:00:22 crc kubenswrapper[4764]: I0309 14:00:22.570856 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" Mar 09 14:00:22 crc kubenswrapper[4764]: I0309 14:00:22.708645 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-ssh-key-openstack-edpm-ipam\") pod \"942b7017-cdda-4d7a-8be8-521111f4fcd1\" (UID: \"942b7017-cdda-4d7a-8be8-521111f4fcd1\") " Mar 09 14:00:22 crc kubenswrapper[4764]: I0309 14:00:22.709264 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-ceph\") pod \"942b7017-cdda-4d7a-8be8-521111f4fcd1\" (UID: \"942b7017-cdda-4d7a-8be8-521111f4fcd1\") " Mar 09 14:00:22 crc kubenswrapper[4764]: I0309 14:00:22.709478 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zdw4\" (UniqueName: \"kubernetes.io/projected/942b7017-cdda-4d7a-8be8-521111f4fcd1-kube-api-access-5zdw4\") pod \"942b7017-cdda-4d7a-8be8-521111f4fcd1\" (UID: \"942b7017-cdda-4d7a-8be8-521111f4fcd1\") " Mar 09 14:00:22 crc kubenswrapper[4764]: I0309 14:00:22.709610 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-inventory\") pod \"942b7017-cdda-4d7a-8be8-521111f4fcd1\" (UID: \"942b7017-cdda-4d7a-8be8-521111f4fcd1\") " Mar 09 14:00:22 crc kubenswrapper[4764]: I0309 14:00:22.716314 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-ceph" (OuterVolumeSpecName: "ceph") pod "942b7017-cdda-4d7a-8be8-521111f4fcd1" (UID: "942b7017-cdda-4d7a-8be8-521111f4fcd1"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:00:22 crc kubenswrapper[4764]: I0309 14:00:22.716850 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/942b7017-cdda-4d7a-8be8-521111f4fcd1-kube-api-access-5zdw4" (OuterVolumeSpecName: "kube-api-access-5zdw4") pod "942b7017-cdda-4d7a-8be8-521111f4fcd1" (UID: "942b7017-cdda-4d7a-8be8-521111f4fcd1"). InnerVolumeSpecName "kube-api-access-5zdw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:00:22 crc kubenswrapper[4764]: I0309 14:00:22.741153 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "942b7017-cdda-4d7a-8be8-521111f4fcd1" (UID: "942b7017-cdda-4d7a-8be8-521111f4fcd1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:00:22 crc kubenswrapper[4764]: I0309 14:00:22.749194 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-inventory" (OuterVolumeSpecName: "inventory") pod "942b7017-cdda-4d7a-8be8-521111f4fcd1" (UID: "942b7017-cdda-4d7a-8be8-521111f4fcd1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:00:22 crc kubenswrapper[4764]: I0309 14:00:22.813039 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zdw4\" (UniqueName: \"kubernetes.io/projected/942b7017-cdda-4d7a-8be8-521111f4fcd1-kube-api-access-5zdw4\") on node \"crc\" DevicePath \"\"" Mar 09 14:00:22 crc kubenswrapper[4764]: I0309 14:00:22.813111 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 14:00:22 crc kubenswrapper[4764]: I0309 14:00:22.813124 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:00:22 crc kubenswrapper[4764]: I0309 14:00:22.813139 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.141254 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" event={"ID":"942b7017-cdda-4d7a-8be8-521111f4fcd1","Type":"ContainerDied","Data":"d92809745759714030457fe7c8af618ee88aa24996b77390082f9f223a4cefad"} Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.141311 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d92809745759714030457fe7c8af618ee88aa24996b77390082f9f223a4cefad" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.141350 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.225446 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg"] Mar 09 14:00:23 crc kubenswrapper[4764]: E0309 14:00:23.226177 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="942b7017-cdda-4d7a-8be8-521111f4fcd1" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.226203 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="942b7017-cdda-4d7a-8be8-521111f4fcd1" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.226451 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="942b7017-cdda-4d7a-8be8-521111f4fcd1" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.227390 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.234351 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.234681 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.234757 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.234863 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.234641 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.237809 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg"] Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.324759 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22ckp\" (UniqueName: \"kubernetes.io/projected/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-kube-api-access-22ckp\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg\" (UID: \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.324842 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg\" (UID: \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.325156 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg\" (UID: \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.325287 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg\" (UID: \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.428070 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg\" (UID: \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.428150 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg\" (UID: \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.428213 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-22ckp\" (UniqueName: \"kubernetes.io/projected/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-kube-api-access-22ckp\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg\" (UID: \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.428256 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg\" (UID: \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.434603 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg\" (UID: \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.434603 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg\" (UID: \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.434741 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg\" (UID: \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" Mar 
09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.450607 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22ckp\" (UniqueName: \"kubernetes.io/projected/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-kube-api-access-22ckp\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg\" (UID: \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.556487 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" Mar 09 14:00:24 crc kubenswrapper[4764]: I0309 14:00:24.325347 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg"] Mar 09 14:00:25 crc kubenswrapper[4764]: I0309 14:00:25.160761 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" event={"ID":"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300","Type":"ContainerStarted","Data":"4dd4c308fa6de7cca4815781796356da95e83110403baa4426c3bd13904126bc"} Mar 09 14:00:25 crc kubenswrapper[4764]: I0309 14:00:25.161172 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" event={"ID":"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300","Type":"ContainerStarted","Data":"ca149789a5aed112c6c479c7602abb7c51c2df6acad7ebd1db01364782405ce0"} Mar 09 14:00:25 crc kubenswrapper[4764]: I0309 14:00:25.194954 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" podStartSLOduration=1.75363846 podStartE2EDuration="2.19481523s" podCreationTimestamp="2026-03-09 14:00:23 +0000 UTC" firstStartedPulling="2026-03-09 14:00:24.337298593 +0000 UTC m=+2379.587470501" lastFinishedPulling="2026-03-09 14:00:24.778475363 +0000 UTC m=+2380.028647271" 
observedRunningTime="2026-03-09 14:00:25.178870824 +0000 UTC m=+2380.429042752" watchObservedRunningTime="2026-03-09 14:00:25.19481523 +0000 UTC m=+2380.444987138" Mar 09 14:00:27 crc kubenswrapper[4764]: I0309 14:00:27.560752 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:00:27 crc kubenswrapper[4764]: E0309 14:00:27.561695 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:00:34 crc kubenswrapper[4764]: I0309 14:00:34.247830 4764 generic.go:334] "Generic (PLEG): container finished" podID="6ee5c8cc-9f2b-42f8-aed5-37c3540bd300" containerID="4dd4c308fa6de7cca4815781796356da95e83110403baa4426c3bd13904126bc" exitCode=0 Mar 09 14:00:34 crc kubenswrapper[4764]: I0309 14:00:34.247957 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" event={"ID":"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300","Type":"ContainerDied","Data":"4dd4c308fa6de7cca4815781796356da95e83110403baa4426c3bd13904126bc"} Mar 09 14:00:35 crc kubenswrapper[4764]: I0309 14:00:35.677275 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" Mar 09 14:00:35 crc kubenswrapper[4764]: I0309 14:00:35.850187 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22ckp\" (UniqueName: \"kubernetes.io/projected/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-kube-api-access-22ckp\") pod \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\" (UID: \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\") " Mar 09 14:00:35 crc kubenswrapper[4764]: I0309 14:00:35.850424 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-ssh-key-openstack-edpm-ipam\") pod \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\" (UID: \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\") " Mar 09 14:00:35 crc kubenswrapper[4764]: I0309 14:00:35.850472 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-ceph\") pod \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\" (UID: \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\") " Mar 09 14:00:35 crc kubenswrapper[4764]: I0309 14:00:35.850519 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-inventory\") pod \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\" (UID: \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\") " Mar 09 14:00:35 crc kubenswrapper[4764]: I0309 14:00:35.857167 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-kube-api-access-22ckp" (OuterVolumeSpecName: "kube-api-access-22ckp") pod "6ee5c8cc-9f2b-42f8-aed5-37c3540bd300" (UID: "6ee5c8cc-9f2b-42f8-aed5-37c3540bd300"). InnerVolumeSpecName "kube-api-access-22ckp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:00:35 crc kubenswrapper[4764]: I0309 14:00:35.859329 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-ceph" (OuterVolumeSpecName: "ceph") pod "6ee5c8cc-9f2b-42f8-aed5-37c3540bd300" (UID: "6ee5c8cc-9f2b-42f8-aed5-37c3540bd300"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:00:35 crc kubenswrapper[4764]: I0309 14:00:35.882585 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6ee5c8cc-9f2b-42f8-aed5-37c3540bd300" (UID: "6ee5c8cc-9f2b-42f8-aed5-37c3540bd300"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:00:35 crc kubenswrapper[4764]: I0309 14:00:35.882923 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-inventory" (OuterVolumeSpecName: "inventory") pod "6ee5c8cc-9f2b-42f8-aed5-37c3540bd300" (UID: "6ee5c8cc-9f2b-42f8-aed5-37c3540bd300"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:00:35 crc kubenswrapper[4764]: I0309 14:00:35.953732 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22ckp\" (UniqueName: \"kubernetes.io/projected/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-kube-api-access-22ckp\") on node \"crc\" DevicePath \"\"" Mar 09 14:00:35 crc kubenswrapper[4764]: I0309 14:00:35.953796 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:00:35 crc kubenswrapper[4764]: I0309 14:00:35.953811 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 14:00:35 crc kubenswrapper[4764]: I0309 14:00:35.953831 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.269967 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" event={"ID":"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300","Type":"ContainerDied","Data":"ca149789a5aed112c6c479c7602abb7c51c2df6acad7ebd1db01364782405ce0"} Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.270017 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca149789a5aed112c6c479c7602abb7c51c2df6acad7ebd1db01364782405ce0" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.270036 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.369956 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s"] Mar 09 14:00:36 crc kubenswrapper[4764]: E0309 14:00:36.370423 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ee5c8cc-9f2b-42f8-aed5-37c3540bd300" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.370444 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ee5c8cc-9f2b-42f8-aed5-37c3540bd300" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.370638 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ee5c8cc-9f2b-42f8-aed5-37c3540bd300" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.371474 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.374734 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.374734 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.375230 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.375896 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.375943 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.376268 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.377795 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.382377 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.386929 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s"] Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.565976 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zx9k\" (UniqueName: 
\"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-kube-api-access-6zx9k\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.566036 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.566418 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.566440 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.566478 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.566719 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.566764 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.566871 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.566918 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.566954 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.566991 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.567023 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.567104 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.668863 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.668929 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zx9k\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-kube-api-access-6zx9k\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.668963 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.668989 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-nova-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.669019 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.669069 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.669139 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.669193 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: 
\"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.669249 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.669280 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.669310 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.669343 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 
14:00:36.669371 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.673731 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.674465 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.675451 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.678135 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.678798 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.678881 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.679890 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.680516 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: 
\"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.681553 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.681980 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.686476 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.687108 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc 
kubenswrapper[4764]: I0309 14:00:36.694263 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zx9k\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-kube-api-access-6zx9k\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.992260 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:37 crc kubenswrapper[4764]: I0309 14:00:37.550895 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s"] Mar 09 14:00:38 crc kubenswrapper[4764]: I0309 14:00:38.292258 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" event={"ID":"949d7512-b3be-4068-b05a-20589fbc2b52","Type":"ContainerStarted","Data":"fd45c11e7b2b7a0ad5cb655148b378fb1667ce6154e5c572398f320b6660f275"} Mar 09 14:00:38 crc kubenswrapper[4764]: I0309 14:00:38.559779 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:00:38 crc kubenswrapper[4764]: E0309 14:00:38.560713 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:00:39 crc kubenswrapper[4764]: I0309 14:00:39.304269 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" event={"ID":"949d7512-b3be-4068-b05a-20589fbc2b52","Type":"ContainerStarted","Data":"39847ae98cf6e663c2d6e6155c011decc4cdfda221bf659392fa3d752f138247"} Mar 09 14:00:39 crc kubenswrapper[4764]: I0309 14:00:39.336064 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" podStartSLOduration=2.833031782 podStartE2EDuration="3.336036723s" podCreationTimestamp="2026-03-09 14:00:36 +0000 UTC" firstStartedPulling="2026-03-09 14:00:37.565861226 +0000 UTC m=+2392.816033134" lastFinishedPulling="2026-03-09 14:00:38.068866177 +0000 UTC m=+2393.319038075" observedRunningTime="2026-03-09 14:00:39.329438376 +0000 UTC m=+2394.579610314" watchObservedRunningTime="2026-03-09 14:00:39.336036723 +0000 UTC m=+2394.586208641" Mar 09 14:00:51 crc kubenswrapper[4764]: I0309 14:00:51.560244 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:00:51 crc kubenswrapper[4764]: E0309 14:00:51.561392 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.156148 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29551081-wz9hv"] Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.160588 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29551081-wz9hv" Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.176337 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29551081-wz9hv"] Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.234783 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-config-data\") pod \"keystone-cron-29551081-wz9hv\" (UID: \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\") " pod="openstack/keystone-cron-29551081-wz9hv" Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.234894 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-fernet-keys\") pod \"keystone-cron-29551081-wz9hv\" (UID: \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\") " pod="openstack/keystone-cron-29551081-wz9hv" Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.234924 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-combined-ca-bundle\") pod \"keystone-cron-29551081-wz9hv\" (UID: \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\") " pod="openstack/keystone-cron-29551081-wz9hv" Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.234962 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jprwh\" (UniqueName: \"kubernetes.io/projected/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-kube-api-access-jprwh\") pod \"keystone-cron-29551081-wz9hv\" (UID: \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\") " pod="openstack/keystone-cron-29551081-wz9hv" Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.337833 4764 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-config-data\") pod \"keystone-cron-29551081-wz9hv\" (UID: \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\") " pod="openstack/keystone-cron-29551081-wz9hv" Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.337934 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-fernet-keys\") pod \"keystone-cron-29551081-wz9hv\" (UID: \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\") " pod="openstack/keystone-cron-29551081-wz9hv" Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.337982 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-combined-ca-bundle\") pod \"keystone-cron-29551081-wz9hv\" (UID: \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\") " pod="openstack/keystone-cron-29551081-wz9hv" Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.338024 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jprwh\" (UniqueName: \"kubernetes.io/projected/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-kube-api-access-jprwh\") pod \"keystone-cron-29551081-wz9hv\" (UID: \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\") " pod="openstack/keystone-cron-29551081-wz9hv" Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.346578 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-config-data\") pod \"keystone-cron-29551081-wz9hv\" (UID: \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\") " pod="openstack/keystone-cron-29551081-wz9hv" Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.347864 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-fernet-keys\") pod \"keystone-cron-29551081-wz9hv\" (UID: \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\") " pod="openstack/keystone-cron-29551081-wz9hv" Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.356496 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-combined-ca-bundle\") pod \"keystone-cron-29551081-wz9hv\" (UID: \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\") " pod="openstack/keystone-cron-29551081-wz9hv" Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.367551 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jprwh\" (UniqueName: \"kubernetes.io/projected/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-kube-api-access-jprwh\") pod \"keystone-cron-29551081-wz9hv\" (UID: \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\") " pod="openstack/keystone-cron-29551081-wz9hv" Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.484022 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29551081-wz9hv" Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.997951 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29551081-wz9hv"] Mar 09 14:01:01 crc kubenswrapper[4764]: I0309 14:01:01.501609 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29551081-wz9hv" event={"ID":"6ec256c5-cf20-4b12-bb84-0f5d3e02460a","Type":"ContainerStarted","Data":"74125e64c2f42534639475226ea119db97fbe809fe5256ca1d1f86b0e608d59a"} Mar 09 14:01:01 crc kubenswrapper[4764]: I0309 14:01:01.501744 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29551081-wz9hv" event={"ID":"6ec256c5-cf20-4b12-bb84-0f5d3e02460a","Type":"ContainerStarted","Data":"ae6f384de670520ca16ff59918eb69c70be5c807796ecc67aeb342a5218c418b"} Mar 09 14:01:01 crc kubenswrapper[4764]: I0309 14:01:01.529134 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29551081-wz9hv" podStartSLOduration=1.529099712 podStartE2EDuration="1.529099712s" podCreationTimestamp="2026-03-09 14:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:01:01.523979795 +0000 UTC m=+2416.774151703" watchObservedRunningTime="2026-03-09 14:01:01.529099712 +0000 UTC m=+2416.779271640" Mar 09 14:01:03 crc kubenswrapper[4764]: I0309 14:01:03.519857 4764 generic.go:334] "Generic (PLEG): container finished" podID="6ec256c5-cf20-4b12-bb84-0f5d3e02460a" containerID="74125e64c2f42534639475226ea119db97fbe809fe5256ca1d1f86b0e608d59a" exitCode=0 Mar 09 14:01:03 crc kubenswrapper[4764]: I0309 14:01:03.519940 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29551081-wz9hv" 
event={"ID":"6ec256c5-cf20-4b12-bb84-0f5d3e02460a","Type":"ContainerDied","Data":"74125e64c2f42534639475226ea119db97fbe809fe5256ca1d1f86b0e608d59a"} Mar 09 14:01:04 crc kubenswrapper[4764]: I0309 14:01:04.878361 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29551081-wz9hv" Mar 09 14:01:04 crc kubenswrapper[4764]: I0309 14:01:04.945303 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jprwh\" (UniqueName: \"kubernetes.io/projected/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-kube-api-access-jprwh\") pod \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\" (UID: \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\") " Mar 09 14:01:04 crc kubenswrapper[4764]: I0309 14:01:04.945397 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-combined-ca-bundle\") pod \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\" (UID: \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\") " Mar 09 14:01:04 crc kubenswrapper[4764]: I0309 14:01:04.945461 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-config-data\") pod \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\" (UID: \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\") " Mar 09 14:01:04 crc kubenswrapper[4764]: I0309 14:01:04.945618 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-fernet-keys\") pod \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\" (UID: \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\") " Mar 09 14:01:04 crc kubenswrapper[4764]: I0309 14:01:04.952696 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "6ec256c5-cf20-4b12-bb84-0f5d3e02460a" (UID: "6ec256c5-cf20-4b12-bb84-0f5d3e02460a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:01:04 crc kubenswrapper[4764]: I0309 14:01:04.956988 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-kube-api-access-jprwh" (OuterVolumeSpecName: "kube-api-access-jprwh") pod "6ec256c5-cf20-4b12-bb84-0f5d3e02460a" (UID: "6ec256c5-cf20-4b12-bb84-0f5d3e02460a"). InnerVolumeSpecName "kube-api-access-jprwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:01:04 crc kubenswrapper[4764]: I0309 14:01:04.975834 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ec256c5-cf20-4b12-bb84-0f5d3e02460a" (UID: "6ec256c5-cf20-4b12-bb84-0f5d3e02460a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:01:04 crc kubenswrapper[4764]: I0309 14:01:04.999945 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-config-data" (OuterVolumeSpecName: "config-data") pod "6ec256c5-cf20-4b12-bb84-0f5d3e02460a" (UID: "6ec256c5-cf20-4b12-bb84-0f5d3e02460a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:01:05 crc kubenswrapper[4764]: I0309 14:01:05.047364 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:05 crc kubenswrapper[4764]: I0309 14:01:05.047746 4764 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:05 crc kubenswrapper[4764]: I0309 14:01:05.047811 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jprwh\" (UniqueName: \"kubernetes.io/projected/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-kube-api-access-jprwh\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:05 crc kubenswrapper[4764]: I0309 14:01:05.047883 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:05 crc kubenswrapper[4764]: I0309 14:01:05.542531 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29551081-wz9hv" event={"ID":"6ec256c5-cf20-4b12-bb84-0f5d3e02460a","Type":"ContainerDied","Data":"ae6f384de670520ca16ff59918eb69c70be5c807796ecc67aeb342a5218c418b"} Mar 09 14:01:05 crc kubenswrapper[4764]: I0309 14:01:05.542596 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae6f384de670520ca16ff59918eb69c70be5c807796ecc67aeb342a5218c418b" Mar 09 14:01:05 crc kubenswrapper[4764]: I0309 14:01:05.542603 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29551081-wz9hv" Mar 09 14:01:05 crc kubenswrapper[4764]: I0309 14:01:05.580142 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:01:05 crc kubenswrapper[4764]: E0309 14:01:05.582955 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:01:06 crc kubenswrapper[4764]: I0309 14:01:06.555243 4764 generic.go:334] "Generic (PLEG): container finished" podID="949d7512-b3be-4068-b05a-20589fbc2b52" containerID="39847ae98cf6e663c2d6e6155c011decc4cdfda221bf659392fa3d752f138247" exitCode=0 Mar 09 14:01:06 crc kubenswrapper[4764]: I0309 14:01:06.555376 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" event={"ID":"949d7512-b3be-4068-b05a-20589fbc2b52","Type":"ContainerDied","Data":"39847ae98cf6e663c2d6e6155c011decc4cdfda221bf659392fa3d752f138247"} Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.004272 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.115384 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-ovn-default-certs-0\") pod \"949d7512-b3be-4068-b05a-20589fbc2b52\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.115456 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zx9k\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-kube-api-access-6zx9k\") pod \"949d7512-b3be-4068-b05a-20589fbc2b52\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.115522 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-libvirt-combined-ca-bundle\") pod \"949d7512-b3be-4068-b05a-20589fbc2b52\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.115562 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-inventory\") pod \"949d7512-b3be-4068-b05a-20589fbc2b52\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.115616 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ceph\") pod \"949d7512-b3be-4068-b05a-20589fbc2b52\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.115745 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-nova-combined-ca-bundle\") pod \"949d7512-b3be-4068-b05a-20589fbc2b52\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.115801 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-neutron-metadata-combined-ca-bundle\") pod \"949d7512-b3be-4068-b05a-20589fbc2b52\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.115869 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-repo-setup-combined-ca-bundle\") pod \"949d7512-b3be-4068-b05a-20589fbc2b52\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.115972 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-bootstrap-combined-ca-bundle\") pod \"949d7512-b3be-4068-b05a-20589fbc2b52\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.116005 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"949d7512-b3be-4068-b05a-20589fbc2b52\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.116585 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ssh-key-openstack-edpm-ipam\") pod \"949d7512-b3be-4068-b05a-20589fbc2b52\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.116634 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ovn-combined-ca-bundle\") pod \"949d7512-b3be-4068-b05a-20589fbc2b52\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.116797 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"949d7512-b3be-4068-b05a-20589fbc2b52\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.122878 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "949d7512-b3be-4068-b05a-20589fbc2b52" (UID: "949d7512-b3be-4068-b05a-20589fbc2b52"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.123893 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "949d7512-b3be-4068-b05a-20589fbc2b52" (UID: "949d7512-b3be-4068-b05a-20589fbc2b52"). 
InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.123934 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ceph" (OuterVolumeSpecName: "ceph") pod "949d7512-b3be-4068-b05a-20589fbc2b52" (UID: "949d7512-b3be-4068-b05a-20589fbc2b52"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.124073 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "949d7512-b3be-4068-b05a-20589fbc2b52" (UID: "949d7512-b3be-4068-b05a-20589fbc2b52"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.124095 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "949d7512-b3be-4068-b05a-20589fbc2b52" (UID: "949d7512-b3be-4068-b05a-20589fbc2b52"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.124685 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "949d7512-b3be-4068-b05a-20589fbc2b52" (UID: "949d7512-b3be-4068-b05a-20589fbc2b52"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.124709 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "949d7512-b3be-4068-b05a-20589fbc2b52" (UID: "949d7512-b3be-4068-b05a-20589fbc2b52"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.126862 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "949d7512-b3be-4068-b05a-20589fbc2b52" (UID: "949d7512-b3be-4068-b05a-20589fbc2b52"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.129054 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "949d7512-b3be-4068-b05a-20589fbc2b52" (UID: "949d7512-b3be-4068-b05a-20589fbc2b52"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.129864 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "949d7512-b3be-4068-b05a-20589fbc2b52" (UID: "949d7512-b3be-4068-b05a-20589fbc2b52"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.132254 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-kube-api-access-6zx9k" (OuterVolumeSpecName: "kube-api-access-6zx9k") pod "949d7512-b3be-4068-b05a-20589fbc2b52" (UID: "949d7512-b3be-4068-b05a-20589fbc2b52"). InnerVolumeSpecName "kube-api-access-6zx9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.144915 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "949d7512-b3be-4068-b05a-20589fbc2b52" (UID: "949d7512-b3be-4068-b05a-20589fbc2b52"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.148037 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-inventory" (OuterVolumeSpecName: "inventory") pod "949d7512-b3be-4068-b05a-20589fbc2b52" (UID: "949d7512-b3be-4068-b05a-20589fbc2b52"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.219751 4764 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.219799 4764 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.219819 4764 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.219834 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zx9k\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-kube-api-access-6zx9k\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.219850 4764 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.219864 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.219876 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.219889 4764 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.219899 4764 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.219910 4764 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.219921 4764 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.219932 4764 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.219943 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.575670 4764 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" event={"ID":"949d7512-b3be-4068-b05a-20589fbc2b52","Type":"ContainerDied","Data":"fd45c11e7b2b7a0ad5cb655148b378fb1667ce6154e5c572398f320b6660f275"} Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.575729 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd45c11e7b2b7a0ad5cb655148b378fb1667ce6154e5c572398f320b6660f275" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.575869 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.730106 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp"] Mar 09 14:01:08 crc kubenswrapper[4764]: E0309 14:01:08.731301 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ec256c5-cf20-4b12-bb84-0f5d3e02460a" containerName="keystone-cron" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.731326 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec256c5-cf20-4b12-bb84-0f5d3e02460a" containerName="keystone-cron" Mar 09 14:01:08 crc kubenswrapper[4764]: E0309 14:01:08.731375 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="949d7512-b3be-4068-b05a-20589fbc2b52" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.731385 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="949d7512-b3be-4068-b05a-20589fbc2b52" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.731615 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ec256c5-cf20-4b12-bb84-0f5d3e02460a" containerName="keystone-cron" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.731663 4764 
memory_manager.go:354] "RemoveStaleState removing state" podUID="949d7512-b3be-4068-b05a-20589fbc2b52" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.732584 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.735758 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.735769 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.736083 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.736091 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.736479 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.751159 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp"] Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.829413 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnqm7\" (UniqueName: \"kubernetes.io/projected/e8ac27d6-e52e-4d38-b772-6ada493e746f-kube-api-access-fnqm7\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp\" (UID: \"e8ac27d6-e52e-4d38-b772-6ada493e746f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.829470 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp\" (UID: \"e8ac27d6-e52e-4d38-b772-6ada493e746f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.829520 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp\" (UID: \"e8ac27d6-e52e-4d38-b772-6ada493e746f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.829540 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp\" (UID: \"e8ac27d6-e52e-4d38-b772-6ada493e746f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.933117 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnqm7\" (UniqueName: \"kubernetes.io/projected/e8ac27d6-e52e-4d38-b772-6ada493e746f-kube-api-access-fnqm7\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp\" (UID: \"e8ac27d6-e52e-4d38-b772-6ada493e746f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.933219 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-ceph\") pod 
\"ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp\" (UID: \"e8ac27d6-e52e-4d38-b772-6ada493e746f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.933318 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp\" (UID: \"e8ac27d6-e52e-4d38-b772-6ada493e746f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.933351 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp\" (UID: \"e8ac27d6-e52e-4d38-b772-6ada493e746f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.938673 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp\" (UID: \"e8ac27d6-e52e-4d38-b772-6ada493e746f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.938742 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp\" (UID: \"e8ac27d6-e52e-4d38-b772-6ada493e746f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.939312 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp\" (UID: \"e8ac27d6-e52e-4d38-b772-6ada493e746f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.954246 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnqm7\" (UniqueName: \"kubernetes.io/projected/e8ac27d6-e52e-4d38-b772-6ada493e746f-kube-api-access-fnqm7\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp\" (UID: \"e8ac27d6-e52e-4d38-b772-6ada493e746f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" Mar 09 14:01:09 crc kubenswrapper[4764]: I0309 14:01:09.063534 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" Mar 09 14:01:09 crc kubenswrapper[4764]: I0309 14:01:09.628884 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp"] Mar 09 14:01:09 crc kubenswrapper[4764]: I0309 14:01:09.631967 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 14:01:10 crc kubenswrapper[4764]: I0309 14:01:10.597747 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" event={"ID":"e8ac27d6-e52e-4d38-b772-6ada493e746f","Type":"ContainerStarted","Data":"61754814ddd979027f1efaca27564db157bd11cd9aa5636858b9aa55143aff7c"} Mar 09 14:01:10 crc kubenswrapper[4764]: I0309 14:01:10.598285 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" 
event={"ID":"e8ac27d6-e52e-4d38-b772-6ada493e746f","Type":"ContainerStarted","Data":"85bb75c3b5612ff132d5c2c795945c0084bae3764401db098ff5980fb2ca16ec"} Mar 09 14:01:10 crc kubenswrapper[4764]: I0309 14:01:10.621097 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" podStartSLOduration=2.195519638 podStartE2EDuration="2.621069521s" podCreationTimestamp="2026-03-09 14:01:08 +0000 UTC" firstStartedPulling="2026-03-09 14:01:09.631604871 +0000 UTC m=+2424.881776779" lastFinishedPulling="2026-03-09 14:01:10.057154764 +0000 UTC m=+2425.307326662" observedRunningTime="2026-03-09 14:01:10.617769453 +0000 UTC m=+2425.867941381" watchObservedRunningTime="2026-03-09 14:01:10.621069521 +0000 UTC m=+2425.871241439" Mar 09 14:01:15 crc kubenswrapper[4764]: I0309 14:01:15.645162 4764 generic.go:334] "Generic (PLEG): container finished" podID="e8ac27d6-e52e-4d38-b772-6ada493e746f" containerID="61754814ddd979027f1efaca27564db157bd11cd9aa5636858b9aa55143aff7c" exitCode=0 Mar 09 14:01:15 crc kubenswrapper[4764]: I0309 14:01:15.645258 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" event={"ID":"e8ac27d6-e52e-4d38-b772-6ada493e746f","Type":"ContainerDied","Data":"61754814ddd979027f1efaca27564db157bd11cd9aa5636858b9aa55143aff7c"} Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.128562 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.313389 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnqm7\" (UniqueName: \"kubernetes.io/projected/e8ac27d6-e52e-4d38-b772-6ada493e746f-kube-api-access-fnqm7\") pod \"e8ac27d6-e52e-4d38-b772-6ada493e746f\" (UID: \"e8ac27d6-e52e-4d38-b772-6ada493e746f\") " Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.313461 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-ceph\") pod \"e8ac27d6-e52e-4d38-b772-6ada493e746f\" (UID: \"e8ac27d6-e52e-4d38-b772-6ada493e746f\") " Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.313527 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-ssh-key-openstack-edpm-ipam\") pod \"e8ac27d6-e52e-4d38-b772-6ada493e746f\" (UID: \"e8ac27d6-e52e-4d38-b772-6ada493e746f\") " Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.313612 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-inventory\") pod \"e8ac27d6-e52e-4d38-b772-6ada493e746f\" (UID: \"e8ac27d6-e52e-4d38-b772-6ada493e746f\") " Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.321935 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-ceph" (OuterVolumeSpecName: "ceph") pod "e8ac27d6-e52e-4d38-b772-6ada493e746f" (UID: "e8ac27d6-e52e-4d38-b772-6ada493e746f"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.327101 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8ac27d6-e52e-4d38-b772-6ada493e746f-kube-api-access-fnqm7" (OuterVolumeSpecName: "kube-api-access-fnqm7") pod "e8ac27d6-e52e-4d38-b772-6ada493e746f" (UID: "e8ac27d6-e52e-4d38-b772-6ada493e746f"). InnerVolumeSpecName "kube-api-access-fnqm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.347338 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e8ac27d6-e52e-4d38-b772-6ada493e746f" (UID: "e8ac27d6-e52e-4d38-b772-6ada493e746f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.349132 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-inventory" (OuterVolumeSpecName: "inventory") pod "e8ac27d6-e52e-4d38-b772-6ada493e746f" (UID: "e8ac27d6-e52e-4d38-b772-6ada493e746f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.416165 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnqm7\" (UniqueName: \"kubernetes.io/projected/e8ac27d6-e52e-4d38-b772-6ada493e746f-kube-api-access-fnqm7\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.416217 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.416233 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.416245 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.665477 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" event={"ID":"e8ac27d6-e52e-4d38-b772-6ada493e746f","Type":"ContainerDied","Data":"85bb75c3b5612ff132d5c2c795945c0084bae3764401db098ff5980fb2ca16ec"} Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.665529 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85bb75c3b5612ff132d5c2c795945c0084bae3764401db098ff5980fb2ca16ec" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.665537 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.770228 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8"] Mar 09 14:01:17 crc kubenswrapper[4764]: E0309 14:01:17.770795 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8ac27d6-e52e-4d38-b772-6ada493e746f" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.770818 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8ac27d6-e52e-4d38-b772-6ada493e746f" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.771017 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8ac27d6-e52e-4d38-b772-6ada493e746f" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.771730 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.778686 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8"] Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.786401 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.786471 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.786402 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.786764 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.786860 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.786932 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.836991 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.837076 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.837160 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.837191 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.837227 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l422c\" (UniqueName: \"kubernetes.io/projected/ede2526d-593a-4258-9ec2-172270be638a-kube-api-access-l422c\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.837268 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ede2526d-593a-4258-9ec2-172270be638a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.939779 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.939874 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.939898 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.939938 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l422c\" (UniqueName: \"kubernetes.io/projected/ede2526d-593a-4258-9ec2-172270be638a-kube-api-access-l422c\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.939982 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: 
\"kubernetes.io/configmap/ede2526d-593a-4258-9ec2-172270be638a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.940059 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.941639 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ede2526d-593a-4258-9ec2-172270be638a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.944439 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.945085 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc 
kubenswrapper[4764]: I0309 14:01:17.945150 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.945535 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.963912 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l422c\" (UniqueName: \"kubernetes.io/projected/ede2526d-593a-4258-9ec2-172270be638a-kube-api-access-l422c\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:18 crc kubenswrapper[4764]: I0309 14:01:18.110456 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:18 crc kubenswrapper[4764]: I0309 14:01:18.658288 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8"] Mar 09 14:01:18 crc kubenswrapper[4764]: I0309 14:01:18.676129 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" event={"ID":"ede2526d-593a-4258-9ec2-172270be638a","Type":"ContainerStarted","Data":"34e4e0e2e0553eae12a91e21b5fe948fe626bbec1e8d92ace1c4dceace8d957e"} Mar 09 14:01:19 crc kubenswrapper[4764]: I0309 14:01:19.688117 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" event={"ID":"ede2526d-593a-4258-9ec2-172270be638a","Type":"ContainerStarted","Data":"2dc5eadd32fcb5eb0fa35e7388b219c67069c6c548b4292850ea9586a4f91976"} Mar 09 14:01:20 crc kubenswrapper[4764]: I0309 14:01:20.560141 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:01:20 crc kubenswrapper[4764]: E0309 14:01:20.560503 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:01:31 crc kubenswrapper[4764]: I0309 14:01:31.560277 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:01:31 crc kubenswrapper[4764]: E0309 14:01:31.561471 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:01:33 crc kubenswrapper[4764]: I0309 14:01:33.906872 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-kl47c" podUID="9333a95c-85e4-4e7d-a142-ae2dd06b4146" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:01:43 crc kubenswrapper[4764]: I0309 14:01:43.560523 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:01:43 crc kubenswrapper[4764]: E0309 14:01:43.561633 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:01:55 crc kubenswrapper[4764]: I0309 14:01:55.571138 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:01:55 crc kubenswrapper[4764]: E0309 14:01:55.574008 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:02:00 
crc kubenswrapper[4764]: I0309 14:02:00.155581 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" podStartSLOduration=42.751668104 podStartE2EDuration="43.155558589s" podCreationTimestamp="2026-03-09 14:01:17 +0000 UTC" firstStartedPulling="2026-03-09 14:01:18.666379334 +0000 UTC m=+2433.916551242" lastFinishedPulling="2026-03-09 14:01:19.070269819 +0000 UTC m=+2434.320441727" observedRunningTime="2026-03-09 14:01:19.71597605 +0000 UTC m=+2434.966147958" watchObservedRunningTime="2026-03-09 14:02:00.155558589 +0000 UTC m=+2475.405730497" Mar 09 14:02:00 crc kubenswrapper[4764]: I0309 14:02:00.160673 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551082-9pffj"] Mar 09 14:02:00 crc kubenswrapper[4764]: I0309 14:02:00.162294 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551082-9pffj" Mar 09 14:02:00 crc kubenswrapper[4764]: I0309 14:02:00.166374 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:02:00 crc kubenswrapper[4764]: I0309 14:02:00.166600 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 14:02:00 crc kubenswrapper[4764]: I0309 14:02:00.166746 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:02:00 crc kubenswrapper[4764]: I0309 14:02:00.172415 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551082-9pffj"] Mar 09 14:02:00 crc kubenswrapper[4764]: I0309 14:02:00.295358 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqd4d\" (UniqueName: \"kubernetes.io/projected/cbc0f639-3ece-4df6-bbaa-af1572005872-kube-api-access-cqd4d\") pod 
\"auto-csr-approver-29551082-9pffj\" (UID: \"cbc0f639-3ece-4df6-bbaa-af1572005872\") " pod="openshift-infra/auto-csr-approver-29551082-9pffj" Mar 09 14:02:00 crc kubenswrapper[4764]: I0309 14:02:00.397884 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqd4d\" (UniqueName: \"kubernetes.io/projected/cbc0f639-3ece-4df6-bbaa-af1572005872-kube-api-access-cqd4d\") pod \"auto-csr-approver-29551082-9pffj\" (UID: \"cbc0f639-3ece-4df6-bbaa-af1572005872\") " pod="openshift-infra/auto-csr-approver-29551082-9pffj" Mar 09 14:02:00 crc kubenswrapper[4764]: I0309 14:02:00.420575 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqd4d\" (UniqueName: \"kubernetes.io/projected/cbc0f639-3ece-4df6-bbaa-af1572005872-kube-api-access-cqd4d\") pod \"auto-csr-approver-29551082-9pffj\" (UID: \"cbc0f639-3ece-4df6-bbaa-af1572005872\") " pod="openshift-infra/auto-csr-approver-29551082-9pffj" Mar 09 14:02:00 crc kubenswrapper[4764]: I0309 14:02:00.491076 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551082-9pffj" Mar 09 14:02:00 crc kubenswrapper[4764]: I0309 14:02:00.955088 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551082-9pffj"] Mar 09 14:02:01 crc kubenswrapper[4764]: I0309 14:02:01.188170 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551082-9pffj" event={"ID":"cbc0f639-3ece-4df6-bbaa-af1572005872","Type":"ContainerStarted","Data":"9ba7e5964b950123ff567487976ceb2369a4ec0837aa4fde8a04e86b8df20854"} Mar 09 14:02:03 crc kubenswrapper[4764]: I0309 14:02:03.209693 4764 generic.go:334] "Generic (PLEG): container finished" podID="cbc0f639-3ece-4df6-bbaa-af1572005872" containerID="083b9151b9954604db8e0ffa63a89b0a3dc0a267db79fa28e9e68016aa4eee5c" exitCode=0 Mar 09 14:02:03 crc kubenswrapper[4764]: I0309 14:02:03.209763 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551082-9pffj" event={"ID":"cbc0f639-3ece-4df6-bbaa-af1572005872","Type":"ContainerDied","Data":"083b9151b9954604db8e0ffa63a89b0a3dc0a267db79fa28e9e68016aa4eee5c"} Mar 09 14:02:04 crc kubenswrapper[4764]: I0309 14:02:04.582356 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551082-9pffj" Mar 09 14:02:04 crc kubenswrapper[4764]: I0309 14:02:04.701504 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqd4d\" (UniqueName: \"kubernetes.io/projected/cbc0f639-3ece-4df6-bbaa-af1572005872-kube-api-access-cqd4d\") pod \"cbc0f639-3ece-4df6-bbaa-af1572005872\" (UID: \"cbc0f639-3ece-4df6-bbaa-af1572005872\") " Mar 09 14:02:04 crc kubenswrapper[4764]: I0309 14:02:04.710223 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbc0f639-3ece-4df6-bbaa-af1572005872-kube-api-access-cqd4d" (OuterVolumeSpecName: "kube-api-access-cqd4d") pod "cbc0f639-3ece-4df6-bbaa-af1572005872" (UID: "cbc0f639-3ece-4df6-bbaa-af1572005872"). InnerVolumeSpecName "kube-api-access-cqd4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:02:04 crc kubenswrapper[4764]: I0309 14:02:04.805859 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqd4d\" (UniqueName: \"kubernetes.io/projected/cbc0f639-3ece-4df6-bbaa-af1572005872-kube-api-access-cqd4d\") on node \"crc\" DevicePath \"\"" Mar 09 14:02:05 crc kubenswrapper[4764]: I0309 14:02:05.232197 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551082-9pffj" event={"ID":"cbc0f639-3ece-4df6-bbaa-af1572005872","Type":"ContainerDied","Data":"9ba7e5964b950123ff567487976ceb2369a4ec0837aa4fde8a04e86b8df20854"} Mar 09 14:02:05 crc kubenswrapper[4764]: I0309 14:02:05.232609 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ba7e5964b950123ff567487976ceb2369a4ec0837aa4fde8a04e86b8df20854" Mar 09 14:02:05 crc kubenswrapper[4764]: I0309 14:02:05.232315 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551082-9pffj" Mar 09 14:02:05 crc kubenswrapper[4764]: I0309 14:02:05.671848 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551076-qrqw5"] Mar 09 14:02:05 crc kubenswrapper[4764]: I0309 14:02:05.679863 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551076-qrqw5"] Mar 09 14:02:06 crc kubenswrapper[4764]: I0309 14:02:06.561433 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:02:06 crc kubenswrapper[4764]: E0309 14:02:06.562329 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:02:07 crc kubenswrapper[4764]: I0309 14:02:07.571571 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a9d864e-dad7-4c7d-a639-d4042bb3339d" path="/var/lib/kubelet/pods/7a9d864e-dad7-4c7d-a639-d4042bb3339d/volumes" Mar 09 14:02:09 crc kubenswrapper[4764]: I0309 14:02:09.297006 4764 scope.go:117] "RemoveContainer" containerID="8f590f50cdda09a93b8757a7d03d71c1018f7d81bfe8e8784e29856175854a29" Mar 09 14:02:18 crc kubenswrapper[4764]: I0309 14:02:18.560774 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:02:18 crc kubenswrapper[4764]: E0309 14:02:18.561683 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:02:20 crc kubenswrapper[4764]: I0309 14:02:20.438705 4764 generic.go:334] "Generic (PLEG): container finished" podID="ede2526d-593a-4258-9ec2-172270be638a" containerID="2dc5eadd32fcb5eb0fa35e7388b219c67069c6c548b4292850ea9586a4f91976" exitCode=0 Mar 09 14:02:20 crc kubenswrapper[4764]: I0309 14:02:20.439081 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" event={"ID":"ede2526d-593a-4258-9ec2-172270be638a","Type":"ContainerDied","Data":"2dc5eadd32fcb5eb0fa35e7388b219c67069c6c548b4292850ea9586a4f91976"} Mar 09 14:02:21 crc kubenswrapper[4764]: I0309 14:02:21.846224 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:02:21 crc kubenswrapper[4764]: I0309 14:02:21.905675 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ceph\") pod \"ede2526d-593a-4258-9ec2-172270be638a\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " Mar 09 14:02:21 crc kubenswrapper[4764]: I0309 14:02:21.905772 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ssh-key-openstack-edpm-ipam\") pod \"ede2526d-593a-4258-9ec2-172270be638a\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " Mar 09 14:02:21 crc kubenswrapper[4764]: I0309 14:02:21.905852 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ede2526d-593a-4258-9ec2-172270be638a-ovncontroller-config-0\") pod 
\"ede2526d-593a-4258-9ec2-172270be638a\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " Mar 09 14:02:21 crc kubenswrapper[4764]: I0309 14:02:21.905922 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-inventory\") pod \"ede2526d-593a-4258-9ec2-172270be638a\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " Mar 09 14:02:21 crc kubenswrapper[4764]: I0309 14:02:21.906065 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ovn-combined-ca-bundle\") pod \"ede2526d-593a-4258-9ec2-172270be638a\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " Mar 09 14:02:21 crc kubenswrapper[4764]: I0309 14:02:21.906148 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l422c\" (UniqueName: \"kubernetes.io/projected/ede2526d-593a-4258-9ec2-172270be638a-kube-api-access-l422c\") pod \"ede2526d-593a-4258-9ec2-172270be638a\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " Mar 09 14:02:21 crc kubenswrapper[4764]: I0309 14:02:21.927988 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "ede2526d-593a-4258-9ec2-172270be638a" (UID: "ede2526d-593a-4258-9ec2-172270be638a"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:02:21 crc kubenswrapper[4764]: I0309 14:02:21.928054 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ceph" (OuterVolumeSpecName: "ceph") pod "ede2526d-593a-4258-9ec2-172270be638a" (UID: "ede2526d-593a-4258-9ec2-172270be638a"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:02:21 crc kubenswrapper[4764]: I0309 14:02:21.928115 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ede2526d-593a-4258-9ec2-172270be638a-kube-api-access-l422c" (OuterVolumeSpecName: "kube-api-access-l422c") pod "ede2526d-593a-4258-9ec2-172270be638a" (UID: "ede2526d-593a-4258-9ec2-172270be638a"). InnerVolumeSpecName "kube-api-access-l422c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:02:21 crc kubenswrapper[4764]: I0309 14:02:21.932005 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ede2526d-593a-4258-9ec2-172270be638a-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "ede2526d-593a-4258-9ec2-172270be638a" (UID: "ede2526d-593a-4258-9ec2-172270be638a"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:02:21 crc kubenswrapper[4764]: I0309 14:02:21.941357 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ede2526d-593a-4258-9ec2-172270be638a" (UID: "ede2526d-593a-4258-9ec2-172270be638a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:02:21 crc kubenswrapper[4764]: I0309 14:02:21.942298 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-inventory" (OuterVolumeSpecName: "inventory") pod "ede2526d-593a-4258-9ec2-172270be638a" (UID: "ede2526d-593a-4258-9ec2-172270be638a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.009513 4764 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.009759 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l422c\" (UniqueName: \"kubernetes.io/projected/ede2526d-593a-4258-9ec2-172270be638a-kube-api-access-l422c\") on node \"crc\" DevicePath \"\"" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.009865 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.009950 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.010028 4764 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ede2526d-593a-4258-9ec2-172270be638a-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.010097 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.461387 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" 
event={"ID":"ede2526d-593a-4258-9ec2-172270be638a","Type":"ContainerDied","Data":"34e4e0e2e0553eae12a91e21b5fe948fe626bbec1e8d92ace1c4dceace8d957e"} Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.461469 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34e4e0e2e0553eae12a91e21b5fe948fe626bbec1e8d92ace1c4dceace8d957e" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.461471 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.545771 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj"] Mar 09 14:02:22 crc kubenswrapper[4764]: E0309 14:02:22.546404 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbc0f639-3ece-4df6-bbaa-af1572005872" containerName="oc" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.546618 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc0f639-3ece-4df6-bbaa-af1572005872" containerName="oc" Mar 09 14:02:22 crc kubenswrapper[4764]: E0309 14:02:22.546679 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ede2526d-593a-4258-9ec2-172270be638a" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.546686 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ede2526d-593a-4258-9ec2-172270be638a" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.546891 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbc0f639-3ece-4df6-bbaa-af1572005872" containerName="oc" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.546930 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ede2526d-593a-4258-9ec2-172270be638a" containerName="ovn-edpm-deployment-openstack-edpm-ipam" 
Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.547843 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.553054 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.556580 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj"] Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.559289 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.559307 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.559360 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.559419 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.561276 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.562444 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.625031 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-nova-metadata-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.625297 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.625388 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phvr6\" (UniqueName: \"kubernetes.io/projected/8a38f1e2-ce88-47d9-883d-4d95c781d181-kube-api-access-phvr6\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.625467 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.625668 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-ssh-key-openstack-edpm-ipam\") 
pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.625757 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.625869 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.728045 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.728213 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.728301 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.728340 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phvr6\" (UniqueName: \"kubernetes.io/projected/8a38f1e2-ce88-47d9-883d-4d95c781d181-kube-api-access-phvr6\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.728378 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.728436 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.728469 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.734412 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.734412 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.734855 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.735164 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.736085 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.741176 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.751746 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phvr6\" (UniqueName: \"kubernetes.io/projected/8a38f1e2-ce88-47d9-883d-4d95c781d181-kube-api-access-phvr6\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.868205 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:23 crc kubenswrapper[4764]: I0309 14:02:23.482403 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj"] Mar 09 14:02:24 crc kubenswrapper[4764]: I0309 14:02:24.479951 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" event={"ID":"8a38f1e2-ce88-47d9-883d-4d95c781d181","Type":"ContainerStarted","Data":"d1f7816a6424fdde2f043a6817a44c6ba9d4601873f3fb65c8072848e7fb38fd"} Mar 09 14:02:24 crc kubenswrapper[4764]: I0309 14:02:24.480370 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" event={"ID":"8a38f1e2-ce88-47d9-883d-4d95c781d181","Type":"ContainerStarted","Data":"a5d86cbde00cd41578cfdae467292759f1c4ed53f38e6d7e158ec1fa5e1cbc67"} Mar 09 14:02:24 crc kubenswrapper[4764]: I0309 14:02:24.503498 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" podStartSLOduration=1.994006773 podStartE2EDuration="2.503473476s" podCreationTimestamp="2026-03-09 14:02:22 +0000 UTC" firstStartedPulling="2026-03-09 14:02:23.490474708 +0000 UTC m=+2498.740646616" lastFinishedPulling="2026-03-09 14:02:23.999941401 +0000 UTC m=+2499.250113319" observedRunningTime="2026-03-09 14:02:24.497805334 +0000 UTC m=+2499.747977252" watchObservedRunningTime="2026-03-09 14:02:24.503473476 +0000 UTC m=+2499.753645384" Mar 09 14:02:33 crc kubenswrapper[4764]: I0309 14:02:33.559530 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:02:33 crc kubenswrapper[4764]: E0309 14:02:33.560322 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:02:44 crc kubenswrapper[4764]: I0309 14:02:44.559884 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:02:44 crc kubenswrapper[4764]: E0309 14:02:44.560801 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:02:55 crc kubenswrapper[4764]: I0309 14:02:55.567236 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:02:55 crc kubenswrapper[4764]: E0309 14:02:55.568324 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:03:08 crc kubenswrapper[4764]: I0309 14:03:08.559897 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:03:08 crc kubenswrapper[4764]: E0309 14:03:08.560824 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:03:14 crc kubenswrapper[4764]: I0309 14:03:14.943136 4764 generic.go:334] "Generic (PLEG): container finished" podID="8a38f1e2-ce88-47d9-883d-4d95c781d181" containerID="d1f7816a6424fdde2f043a6817a44c6ba9d4601873f3fb65c8072848e7fb38fd" exitCode=0 Mar 09 14:03:14 crc kubenswrapper[4764]: I0309 14:03:14.943197 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" event={"ID":"8a38f1e2-ce88-47d9-883d-4d95c781d181","Type":"ContainerDied","Data":"d1f7816a6424fdde2f043a6817a44c6ba9d4601873f3fb65c8072848e7fb38fd"} Mar 09 14:03:16 crc kubenswrapper[4764]: I0309 14:03:16.971158 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" event={"ID":"8a38f1e2-ce88-47d9-883d-4d95c781d181","Type":"ContainerDied","Data":"a5d86cbde00cd41578cfdae467292759f1c4ed53f38e6d7e158ec1fa5e1cbc67"} Mar 09 14:03:16 crc kubenswrapper[4764]: I0309 14:03:16.971609 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5d86cbde00cd41578cfdae467292759f1c4ed53f38e6d7e158ec1fa5e1cbc67" Mar 09 14:03:16 crc kubenswrapper[4764]: I0309 14:03:16.994825 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.015563 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phvr6\" (UniqueName: \"kubernetes.io/projected/8a38f1e2-ce88-47d9-883d-4d95c781d181-kube-api-access-phvr6\") pod \"8a38f1e2-ce88-47d9-883d-4d95c781d181\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.015700 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-nova-metadata-neutron-config-0\") pod \"8a38f1e2-ce88-47d9-883d-4d95c781d181\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.015719 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-ssh-key-openstack-edpm-ipam\") pod \"8a38f1e2-ce88-47d9-883d-4d95c781d181\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.015742 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-inventory\") pod \"8a38f1e2-ce88-47d9-883d-4d95c781d181\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.015764 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-neutron-metadata-combined-ca-bundle\") pod \"8a38f1e2-ce88-47d9-883d-4d95c781d181\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " Mar 09 14:03:17 crc 
kubenswrapper[4764]: I0309 14:03:17.015786 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-ceph\") pod \"8a38f1e2-ce88-47d9-883d-4d95c781d181\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.015902 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-neutron-ovn-metadata-agent-neutron-config-0\") pod \"8a38f1e2-ce88-47d9-883d-4d95c781d181\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.023595 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a38f1e2-ce88-47d9-883d-4d95c781d181-kube-api-access-phvr6" (OuterVolumeSpecName: "kube-api-access-phvr6") pod "8a38f1e2-ce88-47d9-883d-4d95c781d181" (UID: "8a38f1e2-ce88-47d9-883d-4d95c781d181"). InnerVolumeSpecName "kube-api-access-phvr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.024452 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "8a38f1e2-ce88-47d9-883d-4d95c781d181" (UID: "8a38f1e2-ce88-47d9-883d-4d95c781d181"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.028043 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-ceph" (OuterVolumeSpecName: "ceph") pod "8a38f1e2-ce88-47d9-883d-4d95c781d181" (UID: "8a38f1e2-ce88-47d9-883d-4d95c781d181"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.051963 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "8a38f1e2-ce88-47d9-883d-4d95c781d181" (UID: "8a38f1e2-ce88-47d9-883d-4d95c781d181"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.063556 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "8a38f1e2-ce88-47d9-883d-4d95c781d181" (UID: "8a38f1e2-ce88-47d9-883d-4d95c781d181"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.076846 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-inventory" (OuterVolumeSpecName: "inventory") pod "8a38f1e2-ce88-47d9-883d-4d95c781d181" (UID: "8a38f1e2-ce88-47d9-883d-4d95c781d181"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.087266 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8a38f1e2-ce88-47d9-883d-4d95c781d181" (UID: "8a38f1e2-ce88-47d9-883d-4d95c781d181"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.119302 4764 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.119344 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.119358 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.119369 4764 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.119381 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.119395 4764 
reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.119406 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phvr6\" (UniqueName: \"kubernetes.io/projected/8a38f1e2-ce88-47d9-883d-4d95c781d181-kube-api-access-phvr6\") on node \"crc\" DevicePath \"\"" Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.979086 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.146812 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98"] Mar 09 14:03:18 crc kubenswrapper[4764]: E0309 14:03:18.147477 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a38f1e2-ce88-47d9-883d-4d95c781d181" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.147518 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a38f1e2-ce88-47d9-883d-4d95c781d181" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.147845 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a38f1e2-ce88-47d9-883d-4d95c781d181" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.148965 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.152293 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.152940 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.153412 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.153584 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.153720 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.153723 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.161886 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98"] Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.243752 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.243814 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.243838 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.244221 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmjg6\" (UniqueName: \"kubernetes.io/projected/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-kube-api-access-pmjg6\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.244405 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.244470 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: 
\"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.345544 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.345616 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.345667 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.345788 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmjg6\" (UniqueName: \"kubernetes.io/projected/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-kube-api-access-pmjg6\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.345845 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.345879 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.353033 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.353165 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.353773 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" Mar 09 14:03:18 crc 
kubenswrapper[4764]: I0309 14:03:18.353806 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.364412 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.368809 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmjg6\" (UniqueName: \"kubernetes.io/projected/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-kube-api-access-pmjg6\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.515075 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" Mar 09 14:03:19 crc kubenswrapper[4764]: I0309 14:03:19.089286 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98"] Mar 09 14:03:20 crc kubenswrapper[4764]: I0309 14:03:20.002589 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" event={"ID":"0a9ed7f5-c296-41ac-ae0d-5845c66a385a","Type":"ContainerStarted","Data":"a1d62db2728aab12ddd10813940dde51882f73fd4e8503f3f1ba3d93d56bff30"} Mar 09 14:03:20 crc kubenswrapper[4764]: I0309 14:03:20.003462 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" event={"ID":"0a9ed7f5-c296-41ac-ae0d-5845c66a385a","Type":"ContainerStarted","Data":"548509aa0f1e8c3468f071ff1b05af0aabf6c1afd7c27047c4f22b0be5b05951"} Mar 09 14:03:20 crc kubenswrapper[4764]: I0309 14:03:20.027813 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" podStartSLOduration=1.58397101 podStartE2EDuration="2.027787671s" podCreationTimestamp="2026-03-09 14:03:18 +0000 UTC" firstStartedPulling="2026-03-09 14:03:19.094949593 +0000 UTC m=+2554.345121491" lastFinishedPulling="2026-03-09 14:03:19.538766244 +0000 UTC m=+2554.788938152" observedRunningTime="2026-03-09 14:03:20.020955129 +0000 UTC m=+2555.271127037" watchObservedRunningTime="2026-03-09 14:03:20.027787671 +0000 UTC m=+2555.277959579" Mar 09 14:03:21 crc kubenswrapper[4764]: I0309 14:03:21.560143 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:03:21 crc kubenswrapper[4764]: E0309 14:03:21.560444 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:03:33 crc kubenswrapper[4764]: I0309 14:03:33.210611 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jx29x"] Mar 09 14:03:33 crc kubenswrapper[4764]: I0309 14:03:33.214866 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jx29x" Mar 09 14:03:33 crc kubenswrapper[4764]: I0309 14:03:33.228158 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jx29x"] Mar 09 14:03:33 crc kubenswrapper[4764]: I0309 14:03:33.283978 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9twzf\" (UniqueName: \"kubernetes.io/projected/6154590f-34aa-4248-a339-14cb0d11da17-kube-api-access-9twzf\") pod \"certified-operators-jx29x\" (UID: \"6154590f-34aa-4248-a339-14cb0d11da17\") " pod="openshift-marketplace/certified-operators-jx29x" Mar 09 14:03:33 crc kubenswrapper[4764]: I0309 14:03:33.284151 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6154590f-34aa-4248-a339-14cb0d11da17-catalog-content\") pod \"certified-operators-jx29x\" (UID: \"6154590f-34aa-4248-a339-14cb0d11da17\") " pod="openshift-marketplace/certified-operators-jx29x" Mar 09 14:03:33 crc kubenswrapper[4764]: I0309 14:03:33.284196 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6154590f-34aa-4248-a339-14cb0d11da17-utilities\") pod \"certified-operators-jx29x\" (UID: \"6154590f-34aa-4248-a339-14cb0d11da17\") 
" pod="openshift-marketplace/certified-operators-jx29x" Mar 09 14:03:33 crc kubenswrapper[4764]: I0309 14:03:33.385815 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9twzf\" (UniqueName: \"kubernetes.io/projected/6154590f-34aa-4248-a339-14cb0d11da17-kube-api-access-9twzf\") pod \"certified-operators-jx29x\" (UID: \"6154590f-34aa-4248-a339-14cb0d11da17\") " pod="openshift-marketplace/certified-operators-jx29x" Mar 09 14:03:33 crc kubenswrapper[4764]: I0309 14:03:33.385894 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6154590f-34aa-4248-a339-14cb0d11da17-catalog-content\") pod \"certified-operators-jx29x\" (UID: \"6154590f-34aa-4248-a339-14cb0d11da17\") " pod="openshift-marketplace/certified-operators-jx29x" Mar 09 14:03:33 crc kubenswrapper[4764]: I0309 14:03:33.385916 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6154590f-34aa-4248-a339-14cb0d11da17-utilities\") pod \"certified-operators-jx29x\" (UID: \"6154590f-34aa-4248-a339-14cb0d11da17\") " pod="openshift-marketplace/certified-operators-jx29x" Mar 09 14:03:33 crc kubenswrapper[4764]: I0309 14:03:33.386469 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6154590f-34aa-4248-a339-14cb0d11da17-utilities\") pod \"certified-operators-jx29x\" (UID: \"6154590f-34aa-4248-a339-14cb0d11da17\") " pod="openshift-marketplace/certified-operators-jx29x" Mar 09 14:03:33 crc kubenswrapper[4764]: I0309 14:03:33.386573 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6154590f-34aa-4248-a339-14cb0d11da17-catalog-content\") pod \"certified-operators-jx29x\" (UID: \"6154590f-34aa-4248-a339-14cb0d11da17\") " 
pod="openshift-marketplace/certified-operators-jx29x" Mar 09 14:03:33 crc kubenswrapper[4764]: I0309 14:03:33.411452 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9twzf\" (UniqueName: \"kubernetes.io/projected/6154590f-34aa-4248-a339-14cb0d11da17-kube-api-access-9twzf\") pod \"certified-operators-jx29x\" (UID: \"6154590f-34aa-4248-a339-14cb0d11da17\") " pod="openshift-marketplace/certified-operators-jx29x" Mar 09 14:03:33 crc kubenswrapper[4764]: I0309 14:03:33.546992 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jx29x" Mar 09 14:03:34 crc kubenswrapper[4764]: I0309 14:03:34.202824 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jx29x"] Mar 09 14:03:34 crc kubenswrapper[4764]: I0309 14:03:34.559621 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:03:34 crc kubenswrapper[4764]: E0309 14:03:34.560090 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:03:35 crc kubenswrapper[4764]: I0309 14:03:35.145198 4764 generic.go:334] "Generic (PLEG): container finished" podID="6154590f-34aa-4248-a339-14cb0d11da17" containerID="a231bd14b5a52e83e1070151054791b6cd46c3e0738c42bd2c7de1d6fbe77190" exitCode=0 Mar 09 14:03:35 crc kubenswrapper[4764]: I0309 14:03:35.145303 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jx29x" 
event={"ID":"6154590f-34aa-4248-a339-14cb0d11da17","Type":"ContainerDied","Data":"a231bd14b5a52e83e1070151054791b6cd46c3e0738c42bd2c7de1d6fbe77190"} Mar 09 14:03:35 crc kubenswrapper[4764]: I0309 14:03:35.145828 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jx29x" event={"ID":"6154590f-34aa-4248-a339-14cb0d11da17","Type":"ContainerStarted","Data":"a8bed4ae015f91a1af51881a89b70eb39989a4c366d3d28f458614db53116da0"} Mar 09 14:03:37 crc kubenswrapper[4764]: I0309 14:03:37.164593 4764 generic.go:334] "Generic (PLEG): container finished" podID="6154590f-34aa-4248-a339-14cb0d11da17" containerID="cb84120ece4662aa8eaf5e4da54ddf7a38310a0e535e3d45c9e333ddbaf651d6" exitCode=0 Mar 09 14:03:37 crc kubenswrapper[4764]: I0309 14:03:37.164698 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jx29x" event={"ID":"6154590f-34aa-4248-a339-14cb0d11da17","Type":"ContainerDied","Data":"cb84120ece4662aa8eaf5e4da54ddf7a38310a0e535e3d45c9e333ddbaf651d6"} Mar 09 14:03:38 crc kubenswrapper[4764]: I0309 14:03:38.178337 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jx29x" event={"ID":"6154590f-34aa-4248-a339-14cb0d11da17","Type":"ContainerStarted","Data":"5ac092a4daa49b5d1556a11982584c1f9fc67103c309bac82f0cc93c736c239c"} Mar 09 14:03:38 crc kubenswrapper[4764]: I0309 14:03:38.206553 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jx29x" podStartSLOduration=2.7360647670000002 podStartE2EDuration="5.206524592s" podCreationTimestamp="2026-03-09 14:03:33 +0000 UTC" firstStartedPulling="2026-03-09 14:03:35.148504878 +0000 UTC m=+2570.398676796" lastFinishedPulling="2026-03-09 14:03:37.618964713 +0000 UTC m=+2572.869136621" observedRunningTime="2026-03-09 14:03:38.202820983 +0000 UTC m=+2573.452992911" watchObservedRunningTime="2026-03-09 14:03:38.206524592 +0000 UTC 
m=+2573.456696500" Mar 09 14:03:43 crc kubenswrapper[4764]: I0309 14:03:43.547744 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jx29x" Mar 09 14:03:43 crc kubenswrapper[4764]: I0309 14:03:43.548666 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jx29x" Mar 09 14:03:43 crc kubenswrapper[4764]: I0309 14:03:43.601240 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jx29x" Mar 09 14:03:44 crc kubenswrapper[4764]: I0309 14:03:44.282700 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jx29x" Mar 09 14:03:44 crc kubenswrapper[4764]: I0309 14:03:44.348758 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jx29x"] Mar 09 14:03:45 crc kubenswrapper[4764]: I0309 14:03:45.572104 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:03:45 crc kubenswrapper[4764]: E0309 14:03:45.573162 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:03:46 crc kubenswrapper[4764]: I0309 14:03:46.247069 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jx29x" podUID="6154590f-34aa-4248-a339-14cb0d11da17" containerName="registry-server" containerID="cri-o://5ac092a4daa49b5d1556a11982584c1f9fc67103c309bac82f0cc93c736c239c" gracePeriod=2 Mar 
09 14:03:46 crc kubenswrapper[4764]: I0309 14:03:46.712623 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jx29x" Mar 09 14:03:46 crc kubenswrapper[4764]: I0309 14:03:46.883368 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9twzf\" (UniqueName: \"kubernetes.io/projected/6154590f-34aa-4248-a339-14cb0d11da17-kube-api-access-9twzf\") pod \"6154590f-34aa-4248-a339-14cb0d11da17\" (UID: \"6154590f-34aa-4248-a339-14cb0d11da17\") " Mar 09 14:03:46 crc kubenswrapper[4764]: I0309 14:03:46.883434 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6154590f-34aa-4248-a339-14cb0d11da17-utilities\") pod \"6154590f-34aa-4248-a339-14cb0d11da17\" (UID: \"6154590f-34aa-4248-a339-14cb0d11da17\") " Mar 09 14:03:46 crc kubenswrapper[4764]: I0309 14:03:46.883469 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6154590f-34aa-4248-a339-14cb0d11da17-catalog-content\") pod \"6154590f-34aa-4248-a339-14cb0d11da17\" (UID: \"6154590f-34aa-4248-a339-14cb0d11da17\") " Mar 09 14:03:46 crc kubenswrapper[4764]: I0309 14:03:46.884800 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6154590f-34aa-4248-a339-14cb0d11da17-utilities" (OuterVolumeSpecName: "utilities") pod "6154590f-34aa-4248-a339-14cb0d11da17" (UID: "6154590f-34aa-4248-a339-14cb0d11da17"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:03:46 crc kubenswrapper[4764]: I0309 14:03:46.891044 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6154590f-34aa-4248-a339-14cb0d11da17-kube-api-access-9twzf" (OuterVolumeSpecName: "kube-api-access-9twzf") pod "6154590f-34aa-4248-a339-14cb0d11da17" (UID: "6154590f-34aa-4248-a339-14cb0d11da17"). InnerVolumeSpecName "kube-api-access-9twzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:03:46 crc kubenswrapper[4764]: I0309 14:03:46.988290 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6154590f-34aa-4248-a339-14cb0d11da17-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:03:46 crc kubenswrapper[4764]: I0309 14:03:46.988363 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9twzf\" (UniqueName: \"kubernetes.io/projected/6154590f-34aa-4248-a339-14cb0d11da17-kube-api-access-9twzf\") on node \"crc\" DevicePath \"\"" Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.263280 4764 generic.go:334] "Generic (PLEG): container finished" podID="6154590f-34aa-4248-a339-14cb0d11da17" containerID="5ac092a4daa49b5d1556a11982584c1f9fc67103c309bac82f0cc93c736c239c" exitCode=0 Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.263357 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jx29x" event={"ID":"6154590f-34aa-4248-a339-14cb0d11da17","Type":"ContainerDied","Data":"5ac092a4daa49b5d1556a11982584c1f9fc67103c309bac82f0cc93c736c239c"} Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.263398 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jx29x" Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.263407 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jx29x" event={"ID":"6154590f-34aa-4248-a339-14cb0d11da17","Type":"ContainerDied","Data":"a8bed4ae015f91a1af51881a89b70eb39989a4c366d3d28f458614db53116da0"} Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.263442 4764 scope.go:117] "RemoveContainer" containerID="5ac092a4daa49b5d1556a11982584c1f9fc67103c309bac82f0cc93c736c239c" Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.285012 4764 scope.go:117] "RemoveContainer" containerID="cb84120ece4662aa8eaf5e4da54ddf7a38310a0e535e3d45c9e333ddbaf651d6" Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.316601 4764 scope.go:117] "RemoveContainer" containerID="a231bd14b5a52e83e1070151054791b6cd46c3e0738c42bd2c7de1d6fbe77190" Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.358357 4764 scope.go:117] "RemoveContainer" containerID="5ac092a4daa49b5d1556a11982584c1f9fc67103c309bac82f0cc93c736c239c" Mar 09 14:03:47 crc kubenswrapper[4764]: E0309 14:03:47.359159 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ac092a4daa49b5d1556a11982584c1f9fc67103c309bac82f0cc93c736c239c\": container with ID starting with 5ac092a4daa49b5d1556a11982584c1f9fc67103c309bac82f0cc93c736c239c not found: ID does not exist" containerID="5ac092a4daa49b5d1556a11982584c1f9fc67103c309bac82f0cc93c736c239c" Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.359225 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ac092a4daa49b5d1556a11982584c1f9fc67103c309bac82f0cc93c736c239c"} err="failed to get container status \"5ac092a4daa49b5d1556a11982584c1f9fc67103c309bac82f0cc93c736c239c\": rpc error: code = NotFound desc = could not find container 
\"5ac092a4daa49b5d1556a11982584c1f9fc67103c309bac82f0cc93c736c239c\": container with ID starting with 5ac092a4daa49b5d1556a11982584c1f9fc67103c309bac82f0cc93c736c239c not found: ID does not exist" Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.359287 4764 scope.go:117] "RemoveContainer" containerID="cb84120ece4662aa8eaf5e4da54ddf7a38310a0e535e3d45c9e333ddbaf651d6" Mar 09 14:03:47 crc kubenswrapper[4764]: E0309 14:03:47.359714 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb84120ece4662aa8eaf5e4da54ddf7a38310a0e535e3d45c9e333ddbaf651d6\": container with ID starting with cb84120ece4662aa8eaf5e4da54ddf7a38310a0e535e3d45c9e333ddbaf651d6 not found: ID does not exist" containerID="cb84120ece4662aa8eaf5e4da54ddf7a38310a0e535e3d45c9e333ddbaf651d6" Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.359756 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb84120ece4662aa8eaf5e4da54ddf7a38310a0e535e3d45c9e333ddbaf651d6"} err="failed to get container status \"cb84120ece4662aa8eaf5e4da54ddf7a38310a0e535e3d45c9e333ddbaf651d6\": rpc error: code = NotFound desc = could not find container \"cb84120ece4662aa8eaf5e4da54ddf7a38310a0e535e3d45c9e333ddbaf651d6\": container with ID starting with cb84120ece4662aa8eaf5e4da54ddf7a38310a0e535e3d45c9e333ddbaf651d6 not found: ID does not exist" Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.359785 4764 scope.go:117] "RemoveContainer" containerID="a231bd14b5a52e83e1070151054791b6cd46c3e0738c42bd2c7de1d6fbe77190" Mar 09 14:03:47 crc kubenswrapper[4764]: E0309 14:03:47.360090 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a231bd14b5a52e83e1070151054791b6cd46c3e0738c42bd2c7de1d6fbe77190\": container with ID starting with a231bd14b5a52e83e1070151054791b6cd46c3e0738c42bd2c7de1d6fbe77190 not found: ID does not exist" 
containerID="a231bd14b5a52e83e1070151054791b6cd46c3e0738c42bd2c7de1d6fbe77190" Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.360129 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a231bd14b5a52e83e1070151054791b6cd46c3e0738c42bd2c7de1d6fbe77190"} err="failed to get container status \"a231bd14b5a52e83e1070151054791b6cd46c3e0738c42bd2c7de1d6fbe77190\": rpc error: code = NotFound desc = could not find container \"a231bd14b5a52e83e1070151054791b6cd46c3e0738c42bd2c7de1d6fbe77190\": container with ID starting with a231bd14b5a52e83e1070151054791b6cd46c3e0738c42bd2c7de1d6fbe77190 not found: ID does not exist" Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.385922 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6154590f-34aa-4248-a339-14cb0d11da17-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6154590f-34aa-4248-a339-14cb0d11da17" (UID: "6154590f-34aa-4248-a339-14cb0d11da17"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.403722 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6154590f-34aa-4248-a339-14cb0d11da17-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.606925 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jx29x"] Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.615693 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jx29x"] Mar 09 14:03:49 crc kubenswrapper[4764]: I0309 14:03:49.571402 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6154590f-34aa-4248-a339-14cb0d11da17" path="/var/lib/kubelet/pods/6154590f-34aa-4248-a339-14cb0d11da17/volumes" Mar 09 14:03:57 crc kubenswrapper[4764]: I0309 14:03:57.560519 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:03:57 crc kubenswrapper[4764]: E0309 14:03:57.561720 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:04:00 crc kubenswrapper[4764]: I0309 14:04:00.147405 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551084-pchq7"] Mar 09 14:04:00 crc kubenswrapper[4764]: E0309 14:04:00.148606 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6154590f-34aa-4248-a339-14cb0d11da17" containerName="extract-utilities" Mar 09 14:04:00 crc 
kubenswrapper[4764]: I0309 14:04:00.148622 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6154590f-34aa-4248-a339-14cb0d11da17" containerName="extract-utilities" Mar 09 14:04:00 crc kubenswrapper[4764]: E0309 14:04:00.148633 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6154590f-34aa-4248-a339-14cb0d11da17" containerName="extract-content" Mar 09 14:04:00 crc kubenswrapper[4764]: I0309 14:04:00.148640 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6154590f-34aa-4248-a339-14cb0d11da17" containerName="extract-content" Mar 09 14:04:00 crc kubenswrapper[4764]: E0309 14:04:00.148691 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6154590f-34aa-4248-a339-14cb0d11da17" containerName="registry-server" Mar 09 14:04:00 crc kubenswrapper[4764]: I0309 14:04:00.148699 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6154590f-34aa-4248-a339-14cb0d11da17" containerName="registry-server" Mar 09 14:04:00 crc kubenswrapper[4764]: I0309 14:04:00.148870 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6154590f-34aa-4248-a339-14cb0d11da17" containerName="registry-server" Mar 09 14:04:00 crc kubenswrapper[4764]: I0309 14:04:00.149640 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551084-pchq7" Mar 09 14:04:00 crc kubenswrapper[4764]: I0309 14:04:00.152885 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:04:00 crc kubenswrapper[4764]: I0309 14:04:00.152911 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 14:04:00 crc kubenswrapper[4764]: I0309 14:04:00.152885 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:04:00 crc kubenswrapper[4764]: I0309 14:04:00.159439 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551084-pchq7"] Mar 09 14:04:00 crc kubenswrapper[4764]: I0309 14:04:00.291068 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zss62\" (UniqueName: \"kubernetes.io/projected/91d1d723-d4e8-40d8-9d17-3dfee51e7aef-kube-api-access-zss62\") pod \"auto-csr-approver-29551084-pchq7\" (UID: \"91d1d723-d4e8-40d8-9d17-3dfee51e7aef\") " pod="openshift-infra/auto-csr-approver-29551084-pchq7" Mar 09 14:04:00 crc kubenswrapper[4764]: I0309 14:04:00.392795 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zss62\" (UniqueName: \"kubernetes.io/projected/91d1d723-d4e8-40d8-9d17-3dfee51e7aef-kube-api-access-zss62\") pod \"auto-csr-approver-29551084-pchq7\" (UID: \"91d1d723-d4e8-40d8-9d17-3dfee51e7aef\") " pod="openshift-infra/auto-csr-approver-29551084-pchq7" Mar 09 14:04:00 crc kubenswrapper[4764]: I0309 14:04:00.414145 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zss62\" (UniqueName: \"kubernetes.io/projected/91d1d723-d4e8-40d8-9d17-3dfee51e7aef-kube-api-access-zss62\") pod \"auto-csr-approver-29551084-pchq7\" (UID: \"91d1d723-d4e8-40d8-9d17-3dfee51e7aef\") " 
pod="openshift-infra/auto-csr-approver-29551084-pchq7" Mar 09 14:04:00 crc kubenswrapper[4764]: I0309 14:04:00.480332 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551084-pchq7" Mar 09 14:04:00 crc kubenswrapper[4764]: I0309 14:04:00.744348 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551084-pchq7"] Mar 09 14:04:01 crc kubenswrapper[4764]: I0309 14:04:01.400182 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551084-pchq7" event={"ID":"91d1d723-d4e8-40d8-9d17-3dfee51e7aef","Type":"ContainerStarted","Data":"72484cba728ec8aec28a5b633ccfe8a2f5e5486a0d2c3482ac695e52f4c532be"} Mar 09 14:04:02 crc kubenswrapper[4764]: I0309 14:04:02.426218 4764 generic.go:334] "Generic (PLEG): container finished" podID="91d1d723-d4e8-40d8-9d17-3dfee51e7aef" containerID="6e6b381c4b8297d66803da8d662836720642fff63afbcf8243cdcd4157213d11" exitCode=0 Mar 09 14:04:02 crc kubenswrapper[4764]: I0309 14:04:02.426297 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551084-pchq7" event={"ID":"91d1d723-d4e8-40d8-9d17-3dfee51e7aef","Type":"ContainerDied","Data":"6e6b381c4b8297d66803da8d662836720642fff63afbcf8243cdcd4157213d11"} Mar 09 14:04:03 crc kubenswrapper[4764]: I0309 14:04:03.812951 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551084-pchq7" Mar 09 14:04:03 crc kubenswrapper[4764]: I0309 14:04:03.975698 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zss62\" (UniqueName: \"kubernetes.io/projected/91d1d723-d4e8-40d8-9d17-3dfee51e7aef-kube-api-access-zss62\") pod \"91d1d723-d4e8-40d8-9d17-3dfee51e7aef\" (UID: \"91d1d723-d4e8-40d8-9d17-3dfee51e7aef\") " Mar 09 14:04:03 crc kubenswrapper[4764]: I0309 14:04:03.982347 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91d1d723-d4e8-40d8-9d17-3dfee51e7aef-kube-api-access-zss62" (OuterVolumeSpecName: "kube-api-access-zss62") pod "91d1d723-d4e8-40d8-9d17-3dfee51e7aef" (UID: "91d1d723-d4e8-40d8-9d17-3dfee51e7aef"). InnerVolumeSpecName "kube-api-access-zss62". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:04 crc kubenswrapper[4764]: I0309 14:04:04.078893 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zss62\" (UniqueName: \"kubernetes.io/projected/91d1d723-d4e8-40d8-9d17-3dfee51e7aef-kube-api-access-zss62\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:04 crc kubenswrapper[4764]: I0309 14:04:04.448630 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551084-pchq7" event={"ID":"91d1d723-d4e8-40d8-9d17-3dfee51e7aef","Type":"ContainerDied","Data":"72484cba728ec8aec28a5b633ccfe8a2f5e5486a0d2c3482ac695e52f4c532be"} Mar 09 14:04:04 crc kubenswrapper[4764]: I0309 14:04:04.448705 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72484cba728ec8aec28a5b633ccfe8a2f5e5486a0d2c3482ac695e52f4c532be" Mar 09 14:04:04 crc kubenswrapper[4764]: I0309 14:04:04.448743 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551084-pchq7" Mar 09 14:04:04 crc kubenswrapper[4764]: I0309 14:04:04.903863 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551078-z7ms2"] Mar 09 14:04:04 crc kubenswrapper[4764]: I0309 14:04:04.916744 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551078-z7ms2"] Mar 09 14:04:05 crc kubenswrapper[4764]: I0309 14:04:05.571930 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9" path="/var/lib/kubelet/pods/f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9/volumes" Mar 09 14:04:09 crc kubenswrapper[4764]: I0309 14:04:09.392327 4764 scope.go:117] "RemoveContainer" containerID="e98df59174ca147f240e577b9eb7747712c4ce30b3cb22cafa3c795a0cc708fe" Mar 09 14:04:11 crc kubenswrapper[4764]: I0309 14:04:11.561282 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:04:12 crc kubenswrapper[4764]: I0309 14:04:12.540920 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"63776e57dad14b6a74a94432519b0b437cfc2082448e28e0a32706fa439569a0"} Mar 09 14:06:00 crc kubenswrapper[4764]: I0309 14:06:00.152601 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551086-dn2gl"] Mar 09 14:06:00 crc kubenswrapper[4764]: E0309 14:06:00.154308 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91d1d723-d4e8-40d8-9d17-3dfee51e7aef" containerName="oc" Mar 09 14:06:00 crc kubenswrapper[4764]: I0309 14:06:00.154328 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="91d1d723-d4e8-40d8-9d17-3dfee51e7aef" containerName="oc" Mar 09 14:06:00 crc kubenswrapper[4764]: I0309 14:06:00.154574 4764 
memory_manager.go:354] "RemoveStaleState removing state" podUID="91d1d723-d4e8-40d8-9d17-3dfee51e7aef" containerName="oc" Mar 09 14:06:00 crc kubenswrapper[4764]: I0309 14:06:00.155548 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551086-dn2gl" Mar 09 14:06:00 crc kubenswrapper[4764]: I0309 14:06:00.158423 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 14:06:00 crc kubenswrapper[4764]: I0309 14:06:00.158934 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:06:00 crc kubenswrapper[4764]: I0309 14:06:00.159702 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:06:00 crc kubenswrapper[4764]: I0309 14:06:00.172939 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551086-dn2gl"] Mar 09 14:06:00 crc kubenswrapper[4764]: I0309 14:06:00.288329 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p98kr\" (UniqueName: \"kubernetes.io/projected/55c3951f-6e8b-46f4-9332-9c5d658862e4-kube-api-access-p98kr\") pod \"auto-csr-approver-29551086-dn2gl\" (UID: \"55c3951f-6e8b-46f4-9332-9c5d658862e4\") " pod="openshift-infra/auto-csr-approver-29551086-dn2gl" Mar 09 14:06:00 crc kubenswrapper[4764]: I0309 14:06:00.391099 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p98kr\" (UniqueName: \"kubernetes.io/projected/55c3951f-6e8b-46f4-9332-9c5d658862e4-kube-api-access-p98kr\") pod \"auto-csr-approver-29551086-dn2gl\" (UID: \"55c3951f-6e8b-46f4-9332-9c5d658862e4\") " pod="openshift-infra/auto-csr-approver-29551086-dn2gl" Mar 09 14:06:00 crc kubenswrapper[4764]: I0309 14:06:00.414222 4764 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-p98kr\" (UniqueName: \"kubernetes.io/projected/55c3951f-6e8b-46f4-9332-9c5d658862e4-kube-api-access-p98kr\") pod \"auto-csr-approver-29551086-dn2gl\" (UID: \"55c3951f-6e8b-46f4-9332-9c5d658862e4\") " pod="openshift-infra/auto-csr-approver-29551086-dn2gl" Mar 09 14:06:00 crc kubenswrapper[4764]: I0309 14:06:00.480943 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551086-dn2gl" Mar 09 14:06:00 crc kubenswrapper[4764]: I0309 14:06:00.992279 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551086-dn2gl"] Mar 09 14:06:01 crc kubenswrapper[4764]: I0309 14:06:01.634082 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551086-dn2gl" event={"ID":"55c3951f-6e8b-46f4-9332-9c5d658862e4","Type":"ContainerStarted","Data":"fb1ca94ed31e845ce1a0283cdd181ddc6c87515415c8bff16919dbdf8df21971"} Mar 09 14:06:02 crc kubenswrapper[4764]: I0309 14:06:02.645989 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551086-dn2gl" event={"ID":"55c3951f-6e8b-46f4-9332-9c5d658862e4","Type":"ContainerStarted","Data":"4dfa06f7a78f4ddaf54d7700f7fde17ae280a4e75103177bb66105376bf4e6a2"} Mar 09 14:06:02 crc kubenswrapper[4764]: I0309 14:06:02.672121 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551086-dn2gl" podStartSLOduration=1.368834846 podStartE2EDuration="2.672098988s" podCreationTimestamp="2026-03-09 14:06:00 +0000 UTC" firstStartedPulling="2026-03-09 14:06:00.997863236 +0000 UTC m=+2716.248035144" lastFinishedPulling="2026-03-09 14:06:02.301127378 +0000 UTC m=+2717.551299286" observedRunningTime="2026-03-09 14:06:02.661263999 +0000 UTC m=+2717.911435907" watchObservedRunningTime="2026-03-09 14:06:02.672098988 +0000 UTC m=+2717.922270896" Mar 09 14:06:03 crc kubenswrapper[4764]: I0309 
14:06:03.671208 4764 generic.go:334] "Generic (PLEG): container finished" podID="55c3951f-6e8b-46f4-9332-9c5d658862e4" containerID="4dfa06f7a78f4ddaf54d7700f7fde17ae280a4e75103177bb66105376bf4e6a2" exitCode=0 Mar 09 14:06:03 crc kubenswrapper[4764]: I0309 14:06:03.671306 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551086-dn2gl" event={"ID":"55c3951f-6e8b-46f4-9332-9c5d658862e4","Type":"ContainerDied","Data":"4dfa06f7a78f4ddaf54d7700f7fde17ae280a4e75103177bb66105376bf4e6a2"} Mar 09 14:06:05 crc kubenswrapper[4764]: I0309 14:06:05.010036 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551086-dn2gl" Mar 09 14:06:05 crc kubenswrapper[4764]: I0309 14:06:05.126080 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p98kr\" (UniqueName: \"kubernetes.io/projected/55c3951f-6e8b-46f4-9332-9c5d658862e4-kube-api-access-p98kr\") pod \"55c3951f-6e8b-46f4-9332-9c5d658862e4\" (UID: \"55c3951f-6e8b-46f4-9332-9c5d658862e4\") " Mar 09 14:06:05 crc kubenswrapper[4764]: I0309 14:06:05.133881 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55c3951f-6e8b-46f4-9332-9c5d658862e4-kube-api-access-p98kr" (OuterVolumeSpecName: "kube-api-access-p98kr") pod "55c3951f-6e8b-46f4-9332-9c5d658862e4" (UID: "55c3951f-6e8b-46f4-9332-9c5d658862e4"). InnerVolumeSpecName "kube-api-access-p98kr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:06:05 crc kubenswrapper[4764]: I0309 14:06:05.228995 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p98kr\" (UniqueName: \"kubernetes.io/projected/55c3951f-6e8b-46f4-9332-9c5d658862e4-kube-api-access-p98kr\") on node \"crc\" DevicePath \"\"" Mar 09 14:06:05 crc kubenswrapper[4764]: I0309 14:06:05.689486 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551086-dn2gl" event={"ID":"55c3951f-6e8b-46f4-9332-9c5d658862e4","Type":"ContainerDied","Data":"fb1ca94ed31e845ce1a0283cdd181ddc6c87515415c8bff16919dbdf8df21971"} Mar 09 14:06:05 crc kubenswrapper[4764]: I0309 14:06:05.689538 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb1ca94ed31e845ce1a0283cdd181ddc6c87515415c8bff16919dbdf8df21971" Mar 09 14:06:05 crc kubenswrapper[4764]: I0309 14:06:05.689564 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551086-dn2gl" Mar 09 14:06:05 crc kubenswrapper[4764]: I0309 14:06:05.794318 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551080-svz2w"] Mar 09 14:06:05 crc kubenswrapper[4764]: I0309 14:06:05.802467 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551080-svz2w"] Mar 09 14:06:07 crc kubenswrapper[4764]: I0309 14:06:07.571483 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6eebc0e-7e89-4489-b808-7eebf0e54dca" path="/var/lib/kubelet/pods/f6eebc0e-7e89-4489-b808-7eebf0e54dca/volumes" Mar 09 14:06:09 crc kubenswrapper[4764]: I0309 14:06:09.529761 4764 scope.go:117] "RemoveContainer" containerID="55591ff62c6b7e1f8299cbaa3c157dfe4ead0808f086d1b63edc0dcadd27bf4a" Mar 09 14:06:28 crc kubenswrapper[4764]: I0309 14:06:28.370299 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:06:28 crc kubenswrapper[4764]: I0309 14:06:28.370946 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:06:58 crc kubenswrapper[4764]: I0309 14:06:58.371019 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:06:58 crc kubenswrapper[4764]: I0309 14:06:58.371915 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:07:01 crc kubenswrapper[4764]: I0309 14:07:01.251455 4764 generic.go:334] "Generic (PLEG): container finished" podID="0a9ed7f5-c296-41ac-ae0d-5845c66a385a" containerID="a1d62db2728aab12ddd10813940dde51882f73fd4e8503f3f1ba3d93d56bff30" exitCode=0 Mar 09 14:07:01 crc kubenswrapper[4764]: I0309 14:07:01.251592 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" event={"ID":"0a9ed7f5-c296-41ac-ae0d-5845c66a385a","Type":"ContainerDied","Data":"a1d62db2728aab12ddd10813940dde51882f73fd4e8503f3f1ba3d93d56bff30"} Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 
14:07:02.671047 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.859707 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-inventory\") pod \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.859774 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-libvirt-secret-0\") pod \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.859823 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-ceph\") pod \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.859997 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-libvirt-combined-ca-bundle\") pod \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.860051 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmjg6\" (UniqueName: \"kubernetes.io/projected/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-kube-api-access-pmjg6\") pod \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.860078 
4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-ssh-key-openstack-edpm-ipam\") pod \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.867825 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "0a9ed7f5-c296-41ac-ae0d-5845c66a385a" (UID: "0a9ed7f5-c296-41ac-ae0d-5845c66a385a"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.869167 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-ceph" (OuterVolumeSpecName: "ceph") pod "0a9ed7f5-c296-41ac-ae0d-5845c66a385a" (UID: "0a9ed7f5-c296-41ac-ae0d-5845c66a385a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.869970 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-kube-api-access-pmjg6" (OuterVolumeSpecName: "kube-api-access-pmjg6") pod "0a9ed7f5-c296-41ac-ae0d-5845c66a385a" (UID: "0a9ed7f5-c296-41ac-ae0d-5845c66a385a"). InnerVolumeSpecName "kube-api-access-pmjg6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.892365 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0a9ed7f5-c296-41ac-ae0d-5845c66a385a" (UID: "0a9ed7f5-c296-41ac-ae0d-5845c66a385a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.894623 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-inventory" (OuterVolumeSpecName: "inventory") pod "0a9ed7f5-c296-41ac-ae0d-5845c66a385a" (UID: "0a9ed7f5-c296-41ac-ae0d-5845c66a385a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.901623 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "0a9ed7f5-c296-41ac-ae0d-5845c66a385a" (UID: "0a9ed7f5-c296-41ac-ae0d-5845c66a385a"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.962582 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.962628 4764 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.962660 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.962671 4764 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.962682 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.962693 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmjg6\" (UniqueName: \"kubernetes.io/projected/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-kube-api-access-pmjg6\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.271348 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" 
event={"ID":"0a9ed7f5-c296-41ac-ae0d-5845c66a385a","Type":"ContainerDied","Data":"548509aa0f1e8c3468f071ff1b05af0aabf6c1afd7c27047c4f22b0be5b05951"} Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.271857 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="548509aa0f1e8c3468f071ff1b05af0aabf6c1afd7c27047c4f22b0be5b05951" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.271426 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.378018 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq"] Mar 09 14:07:03 crc kubenswrapper[4764]: E0309 14:07:03.378523 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55c3951f-6e8b-46f4-9332-9c5d658862e4" containerName="oc" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.378546 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="55c3951f-6e8b-46f4-9332-9c5d658862e4" containerName="oc" Mar 09 14:07:03 crc kubenswrapper[4764]: E0309 14:07:03.378578 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a9ed7f5-c296-41ac-ae0d-5845c66a385a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.378585 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a9ed7f5-c296-41ac-ae0d-5845c66a385a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.378789 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="55c3951f-6e8b-46f4-9332-9c5d658862e4" containerName="oc" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.378808 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a9ed7f5-c296-41ac-ae0d-5845c66a385a" 
containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.379569 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.384868 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.384999 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.385226 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.385338 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.385508 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.385555 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.385697 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.385763 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.385901 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.397326 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq"] Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.576240 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c6xn\" (UniqueName: \"kubernetes.io/projected/eab144b6-e27c-4ffc-9dd5-6236ca12719f-kube-api-access-7c6xn\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.576297 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.576328 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.576356 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-3\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.576378 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.576419 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.576451 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.576478 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc 
kubenswrapper[4764]: I0309 14:07:03.576515 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.576553 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-2\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.576583 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.576605 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.576621 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.678777 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c6xn\" (UniqueName: \"kubernetes.io/projected/eab144b6-e27c-4ffc-9dd5-6236ca12719f-kube-api-access-7c6xn\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.678859 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.678899 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.678953 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: 
\"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-3\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.678981 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.679032 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.679104 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.679143 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: 
\"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.679252 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.679294 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-2\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.679334 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.679358 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc 
kubenswrapper[4764]: I0309 14:07:03.679384 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.681390 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.681461 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.685156 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-3\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.685334 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.685455 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.685988 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.686570 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.686616 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: 
\"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.686624 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.687581 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-2\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.687855 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.693848 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.696516 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c6xn\" (UniqueName: \"kubernetes.io/projected/eab144b6-e27c-4ffc-9dd5-6236ca12719f-kube-api-access-7c6xn\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.700462 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:04 crc kubenswrapper[4764]: I0309 14:07:04.247412 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq"] Mar 09 14:07:04 crc kubenswrapper[4764]: I0309 14:07:04.253486 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 14:07:04 crc kubenswrapper[4764]: I0309 14:07:04.282143 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" event={"ID":"eab144b6-e27c-4ffc-9dd5-6236ca12719f","Type":"ContainerStarted","Data":"764af20acdc888d995ca2db68f3ab58c7fcd79d0559f4ce7bd21b3b2183c8e62"} Mar 09 14:07:05 crc kubenswrapper[4764]: I0309 14:07:05.293952 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" event={"ID":"eab144b6-e27c-4ffc-9dd5-6236ca12719f","Type":"ContainerStarted","Data":"86fa8795ecd86c7b9c42df1c80c762e511f08dce7f9815c18841bff01c585b45"} Mar 09 14:07:05 crc kubenswrapper[4764]: I0309 14:07:05.320029 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" podStartSLOduration=1.590808214 podStartE2EDuration="2.320001905s" podCreationTimestamp="2026-03-09 14:07:03 +0000 UTC" 
firstStartedPulling="2026-03-09 14:07:04.2530766 +0000 UTC m=+2779.503248518" lastFinishedPulling="2026-03-09 14:07:04.982270301 +0000 UTC m=+2780.232442209" observedRunningTime="2026-03-09 14:07:05.314713494 +0000 UTC m=+2780.564885432" watchObservedRunningTime="2026-03-09 14:07:05.320001905 +0000 UTC m=+2780.570173823" Mar 09 14:07:28 crc kubenswrapper[4764]: I0309 14:07:28.370239 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:07:28 crc kubenswrapper[4764]: I0309 14:07:28.370822 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:07:28 crc kubenswrapper[4764]: I0309 14:07:28.370878 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 14:07:28 crc kubenswrapper[4764]: I0309 14:07:28.371752 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"63776e57dad14b6a74a94432519b0b437cfc2082448e28e0a32706fa439569a0"} pod="openshift-machine-config-operator/machine-config-daemon-xxczl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 14:07:28 crc kubenswrapper[4764]: I0309 14:07:28.371807 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" 
containerName="machine-config-daemon" containerID="cri-o://63776e57dad14b6a74a94432519b0b437cfc2082448e28e0a32706fa439569a0" gracePeriod=600 Mar 09 14:07:28 crc kubenswrapper[4764]: I0309 14:07:28.524245 4764 generic.go:334] "Generic (PLEG): container finished" podID="6bcdd179-43c2-427c-9fac-7155c122e922" containerID="63776e57dad14b6a74a94432519b0b437cfc2082448e28e0a32706fa439569a0" exitCode=0 Mar 09 14:07:28 crc kubenswrapper[4764]: I0309 14:07:28.524568 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerDied","Data":"63776e57dad14b6a74a94432519b0b437cfc2082448e28e0a32706fa439569a0"} Mar 09 14:07:28 crc kubenswrapper[4764]: I0309 14:07:28.525124 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:07:29 crc kubenswrapper[4764]: I0309 14:07:29.537128 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb"} Mar 09 14:08:00 crc kubenswrapper[4764]: I0309 14:08:00.146283 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551088-wtbvc"] Mar 09 14:08:00 crc kubenswrapper[4764]: I0309 14:08:00.148272 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551088-wtbvc" Mar 09 14:08:00 crc kubenswrapper[4764]: I0309 14:08:00.154162 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 14:08:00 crc kubenswrapper[4764]: I0309 14:08:00.154586 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:08:00 crc kubenswrapper[4764]: I0309 14:08:00.154181 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:08:00 crc kubenswrapper[4764]: I0309 14:08:00.162088 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551088-wtbvc"] Mar 09 14:08:00 crc kubenswrapper[4764]: I0309 14:08:00.207739 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rczx5\" (UniqueName: \"kubernetes.io/projected/902ae1d9-a43c-46c6-a492-10ee0242e721-kube-api-access-rczx5\") pod \"auto-csr-approver-29551088-wtbvc\" (UID: \"902ae1d9-a43c-46c6-a492-10ee0242e721\") " pod="openshift-infra/auto-csr-approver-29551088-wtbvc" Mar 09 14:08:00 crc kubenswrapper[4764]: I0309 14:08:00.309574 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczx5\" (UniqueName: \"kubernetes.io/projected/902ae1d9-a43c-46c6-a492-10ee0242e721-kube-api-access-rczx5\") pod \"auto-csr-approver-29551088-wtbvc\" (UID: \"902ae1d9-a43c-46c6-a492-10ee0242e721\") " pod="openshift-infra/auto-csr-approver-29551088-wtbvc" Mar 09 14:08:00 crc kubenswrapper[4764]: I0309 14:08:00.347181 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczx5\" (UniqueName: \"kubernetes.io/projected/902ae1d9-a43c-46c6-a492-10ee0242e721-kube-api-access-rczx5\") pod \"auto-csr-approver-29551088-wtbvc\" (UID: \"902ae1d9-a43c-46c6-a492-10ee0242e721\") " 
pod="openshift-infra/auto-csr-approver-29551088-wtbvc" Mar 09 14:08:00 crc kubenswrapper[4764]: I0309 14:08:00.525484 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551088-wtbvc" Mar 09 14:08:00 crc kubenswrapper[4764]: I0309 14:08:00.997309 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551088-wtbvc"] Mar 09 14:08:01 crc kubenswrapper[4764]: I0309 14:08:01.865832 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551088-wtbvc" event={"ID":"902ae1d9-a43c-46c6-a492-10ee0242e721","Type":"ContainerStarted","Data":"c14e088b1d022613970d0112683f438fb3a8b48c6452e709953c210fc2585b9e"} Mar 09 14:08:02 crc kubenswrapper[4764]: I0309 14:08:02.878866 4764 generic.go:334] "Generic (PLEG): container finished" podID="902ae1d9-a43c-46c6-a492-10ee0242e721" containerID="20854488fc265f9f6a154273e0d45920e3a458753547063958f0c25388b08a64" exitCode=0 Mar 09 14:08:02 crc kubenswrapper[4764]: I0309 14:08:02.878933 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551088-wtbvc" event={"ID":"902ae1d9-a43c-46c6-a492-10ee0242e721","Type":"ContainerDied","Data":"20854488fc265f9f6a154273e0d45920e3a458753547063958f0c25388b08a64"} Mar 09 14:08:04 crc kubenswrapper[4764]: I0309 14:08:04.269251 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551088-wtbvc" Mar 09 14:08:04 crc kubenswrapper[4764]: I0309 14:08:04.401200 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rczx5\" (UniqueName: \"kubernetes.io/projected/902ae1d9-a43c-46c6-a492-10ee0242e721-kube-api-access-rczx5\") pod \"902ae1d9-a43c-46c6-a492-10ee0242e721\" (UID: \"902ae1d9-a43c-46c6-a492-10ee0242e721\") " Mar 09 14:08:04 crc kubenswrapper[4764]: I0309 14:08:04.411176 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/902ae1d9-a43c-46c6-a492-10ee0242e721-kube-api-access-rczx5" (OuterVolumeSpecName: "kube-api-access-rczx5") pod "902ae1d9-a43c-46c6-a492-10ee0242e721" (UID: "902ae1d9-a43c-46c6-a492-10ee0242e721"). InnerVolumeSpecName "kube-api-access-rczx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:08:04 crc kubenswrapper[4764]: I0309 14:08:04.503158 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rczx5\" (UniqueName: \"kubernetes.io/projected/902ae1d9-a43c-46c6-a492-10ee0242e721-kube-api-access-rczx5\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:04 crc kubenswrapper[4764]: I0309 14:08:04.901807 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551088-wtbvc" event={"ID":"902ae1d9-a43c-46c6-a492-10ee0242e721","Type":"ContainerDied","Data":"c14e088b1d022613970d0112683f438fb3a8b48c6452e709953c210fc2585b9e"} Mar 09 14:08:04 crc kubenswrapper[4764]: I0309 14:08:04.901857 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c14e088b1d022613970d0112683f438fb3a8b48c6452e709953c210fc2585b9e" Mar 09 14:08:04 crc kubenswrapper[4764]: I0309 14:08:04.902033 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551088-wtbvc" Mar 09 14:08:05 crc kubenswrapper[4764]: I0309 14:08:05.357494 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551082-9pffj"] Mar 09 14:08:05 crc kubenswrapper[4764]: I0309 14:08:05.368127 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551082-9pffj"] Mar 09 14:08:05 crc kubenswrapper[4764]: I0309 14:08:05.576319 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbc0f639-3ece-4df6-bbaa-af1572005872" path="/var/lib/kubelet/pods/cbc0f639-3ece-4df6-bbaa-af1572005872/volumes" Mar 09 14:08:09 crc kubenswrapper[4764]: I0309 14:08:09.629568 4764 scope.go:117] "RemoveContainer" containerID="083b9151b9954604db8e0ffa63a89b0a3dc0a267db79fa28e9e68016aa4eee5c" Mar 09 14:08:30 crc kubenswrapper[4764]: I0309 14:08:30.741936 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wfpcl"] Mar 09 14:08:30 crc kubenswrapper[4764]: E0309 14:08:30.743957 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="902ae1d9-a43c-46c6-a492-10ee0242e721" containerName="oc" Mar 09 14:08:30 crc kubenswrapper[4764]: I0309 14:08:30.744004 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="902ae1d9-a43c-46c6-a492-10ee0242e721" containerName="oc" Mar 09 14:08:30 crc kubenswrapper[4764]: I0309 14:08:30.744838 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="902ae1d9-a43c-46c6-a492-10ee0242e721" containerName="oc" Mar 09 14:08:30 crc kubenswrapper[4764]: I0309 14:08:30.748183 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wfpcl" Mar 09 14:08:30 crc kubenswrapper[4764]: I0309 14:08:30.762480 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wfpcl"] Mar 09 14:08:30 crc kubenswrapper[4764]: I0309 14:08:30.853730 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f47d85f-63cb-45ce-b935-1b2a534523dc-catalog-content\") pod \"community-operators-wfpcl\" (UID: \"5f47d85f-63cb-45ce-b935-1b2a534523dc\") " pod="openshift-marketplace/community-operators-wfpcl" Mar 09 14:08:30 crc kubenswrapper[4764]: I0309 14:08:30.853872 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnxnf\" (UniqueName: \"kubernetes.io/projected/5f47d85f-63cb-45ce-b935-1b2a534523dc-kube-api-access-wnxnf\") pod \"community-operators-wfpcl\" (UID: \"5f47d85f-63cb-45ce-b935-1b2a534523dc\") " pod="openshift-marketplace/community-operators-wfpcl" Mar 09 14:08:30 crc kubenswrapper[4764]: I0309 14:08:30.853936 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f47d85f-63cb-45ce-b935-1b2a534523dc-utilities\") pod \"community-operators-wfpcl\" (UID: \"5f47d85f-63cb-45ce-b935-1b2a534523dc\") " pod="openshift-marketplace/community-operators-wfpcl" Mar 09 14:08:30 crc kubenswrapper[4764]: I0309 14:08:30.955617 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f47d85f-63cb-45ce-b935-1b2a534523dc-utilities\") pod \"community-operators-wfpcl\" (UID: \"5f47d85f-63cb-45ce-b935-1b2a534523dc\") " pod="openshift-marketplace/community-operators-wfpcl" Mar 09 14:08:30 crc kubenswrapper[4764]: I0309 14:08:30.955799 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f47d85f-63cb-45ce-b935-1b2a534523dc-catalog-content\") pod \"community-operators-wfpcl\" (UID: \"5f47d85f-63cb-45ce-b935-1b2a534523dc\") " pod="openshift-marketplace/community-operators-wfpcl" Mar 09 14:08:30 crc kubenswrapper[4764]: I0309 14:08:30.955941 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnxnf\" (UniqueName: \"kubernetes.io/projected/5f47d85f-63cb-45ce-b935-1b2a534523dc-kube-api-access-wnxnf\") pod \"community-operators-wfpcl\" (UID: \"5f47d85f-63cb-45ce-b935-1b2a534523dc\") " pod="openshift-marketplace/community-operators-wfpcl" Mar 09 14:08:30 crc kubenswrapper[4764]: I0309 14:08:30.956411 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f47d85f-63cb-45ce-b935-1b2a534523dc-catalog-content\") pod \"community-operators-wfpcl\" (UID: \"5f47d85f-63cb-45ce-b935-1b2a534523dc\") " pod="openshift-marketplace/community-operators-wfpcl" Mar 09 14:08:30 crc kubenswrapper[4764]: I0309 14:08:30.956876 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f47d85f-63cb-45ce-b935-1b2a534523dc-utilities\") pod \"community-operators-wfpcl\" (UID: \"5f47d85f-63cb-45ce-b935-1b2a534523dc\") " pod="openshift-marketplace/community-operators-wfpcl" Mar 09 14:08:30 crc kubenswrapper[4764]: I0309 14:08:30.987748 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnxnf\" (UniqueName: \"kubernetes.io/projected/5f47d85f-63cb-45ce-b935-1b2a534523dc-kube-api-access-wnxnf\") pod \"community-operators-wfpcl\" (UID: \"5f47d85f-63cb-45ce-b935-1b2a534523dc\") " pod="openshift-marketplace/community-operators-wfpcl" Mar 09 14:08:31 crc kubenswrapper[4764]: I0309 14:08:31.095904 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wfpcl" Mar 09 14:08:31 crc kubenswrapper[4764]: I0309 14:08:31.701669 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wfpcl"] Mar 09 14:08:32 crc kubenswrapper[4764]: I0309 14:08:32.157438 4764 generic.go:334] "Generic (PLEG): container finished" podID="5f47d85f-63cb-45ce-b935-1b2a534523dc" containerID="bb942a0c55a1ddef8cb3af71369818a07d7c6cedd6474ba3eefacd8515ae9e03" exitCode=0 Mar 09 14:08:32 crc kubenswrapper[4764]: I0309 14:08:32.157571 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wfpcl" event={"ID":"5f47d85f-63cb-45ce-b935-1b2a534523dc","Type":"ContainerDied","Data":"bb942a0c55a1ddef8cb3af71369818a07d7c6cedd6474ba3eefacd8515ae9e03"} Mar 09 14:08:32 crc kubenswrapper[4764]: I0309 14:08:32.157956 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wfpcl" event={"ID":"5f47d85f-63cb-45ce-b935-1b2a534523dc","Type":"ContainerStarted","Data":"cfc0c125ad4a9bdc4d2c39a4d4e59044968a7fa9e65b6f1d8fca1ba7dfcb96b5"} Mar 09 14:08:33 crc kubenswrapper[4764]: I0309 14:08:33.176349 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wfpcl" event={"ID":"5f47d85f-63cb-45ce-b935-1b2a534523dc","Type":"ContainerStarted","Data":"5012025ebd18fb17d5d312a24940b888eecb61a59a4797cfdc87d49706f8c4cd"} Mar 09 14:08:34 crc kubenswrapper[4764]: I0309 14:08:34.187347 4764 generic.go:334] "Generic (PLEG): container finished" podID="5f47d85f-63cb-45ce-b935-1b2a534523dc" containerID="5012025ebd18fb17d5d312a24940b888eecb61a59a4797cfdc87d49706f8c4cd" exitCode=0 Mar 09 14:08:34 crc kubenswrapper[4764]: I0309 14:08:34.187406 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wfpcl" 
event={"ID":"5f47d85f-63cb-45ce-b935-1b2a534523dc","Type":"ContainerDied","Data":"5012025ebd18fb17d5d312a24940b888eecb61a59a4797cfdc87d49706f8c4cd"} Mar 09 14:08:35 crc kubenswrapper[4764]: I0309 14:08:35.201508 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wfpcl" event={"ID":"5f47d85f-63cb-45ce-b935-1b2a534523dc","Type":"ContainerStarted","Data":"da5f4a3cea7278ff0e8c3dfb2732a6e5cb3460daded67995e65716fc617874f5"} Mar 09 14:08:35 crc kubenswrapper[4764]: I0309 14:08:35.230521 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wfpcl" podStartSLOduration=2.786725282 podStartE2EDuration="5.230492851s" podCreationTimestamp="2026-03-09 14:08:30 +0000 UTC" firstStartedPulling="2026-03-09 14:08:32.162583547 +0000 UTC m=+2867.412755455" lastFinishedPulling="2026-03-09 14:08:34.606351116 +0000 UTC m=+2869.856523024" observedRunningTime="2026-03-09 14:08:35.22145231 +0000 UTC m=+2870.471624218" watchObservedRunningTime="2026-03-09 14:08:35.230492851 +0000 UTC m=+2870.480664759" Mar 09 14:08:41 crc kubenswrapper[4764]: I0309 14:08:41.097043 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wfpcl" Mar 09 14:08:41 crc kubenswrapper[4764]: I0309 14:08:41.097912 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wfpcl" Mar 09 14:08:41 crc kubenswrapper[4764]: I0309 14:08:41.146292 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wfpcl" Mar 09 14:08:41 crc kubenswrapper[4764]: I0309 14:08:41.324233 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wfpcl" Mar 09 14:08:41 crc kubenswrapper[4764]: I0309 14:08:41.904230 4764 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-vpnh7"] Mar 09 14:08:41 crc kubenswrapper[4764]: I0309 14:08:41.907070 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vpnh7" Mar 09 14:08:41 crc kubenswrapper[4764]: I0309 14:08:41.933899 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vpnh7"] Mar 09 14:08:41 crc kubenswrapper[4764]: I0309 14:08:41.956023 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-utilities\") pod \"redhat-operators-vpnh7\" (UID: \"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc\") " pod="openshift-marketplace/redhat-operators-vpnh7" Mar 09 14:08:41 crc kubenswrapper[4764]: I0309 14:08:41.956148 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d46th\" (UniqueName: \"kubernetes.io/projected/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-kube-api-access-d46th\") pod \"redhat-operators-vpnh7\" (UID: \"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc\") " pod="openshift-marketplace/redhat-operators-vpnh7" Mar 09 14:08:41 crc kubenswrapper[4764]: I0309 14:08:41.956213 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-catalog-content\") pod \"redhat-operators-vpnh7\" (UID: \"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc\") " pod="openshift-marketplace/redhat-operators-vpnh7" Mar 09 14:08:42 crc kubenswrapper[4764]: I0309 14:08:42.058542 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-utilities\") pod \"redhat-operators-vpnh7\" (UID: \"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc\") " 
pod="openshift-marketplace/redhat-operators-vpnh7" Mar 09 14:08:42 crc kubenswrapper[4764]: I0309 14:08:42.058988 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d46th\" (UniqueName: \"kubernetes.io/projected/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-kube-api-access-d46th\") pod \"redhat-operators-vpnh7\" (UID: \"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc\") " pod="openshift-marketplace/redhat-operators-vpnh7" Mar 09 14:08:42 crc kubenswrapper[4764]: I0309 14:08:42.059196 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-utilities\") pod \"redhat-operators-vpnh7\" (UID: \"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc\") " pod="openshift-marketplace/redhat-operators-vpnh7" Mar 09 14:08:42 crc kubenswrapper[4764]: I0309 14:08:42.059386 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-catalog-content\") pod \"redhat-operators-vpnh7\" (UID: \"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc\") " pod="openshift-marketplace/redhat-operators-vpnh7" Mar 09 14:08:42 crc kubenswrapper[4764]: I0309 14:08:42.059931 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-catalog-content\") pod \"redhat-operators-vpnh7\" (UID: \"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc\") " pod="openshift-marketplace/redhat-operators-vpnh7" Mar 09 14:08:42 crc kubenswrapper[4764]: I0309 14:08:42.081735 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d46th\" (UniqueName: \"kubernetes.io/projected/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-kube-api-access-d46th\") pod \"redhat-operators-vpnh7\" (UID: \"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc\") " pod="openshift-marketplace/redhat-operators-vpnh7" Mar 
09 14:08:42 crc kubenswrapper[4764]: I0309 14:08:42.235494 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vpnh7" Mar 09 14:08:42 crc kubenswrapper[4764]: I0309 14:08:42.757079 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vpnh7"] Mar 09 14:08:43 crc kubenswrapper[4764]: I0309 14:08:43.288894 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wfpcl"] Mar 09 14:08:43 crc kubenswrapper[4764]: I0309 14:08:43.289882 4764 generic.go:334] "Generic (PLEG): container finished" podID="d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc" containerID="1a77f0233da3724318ca2ac4138ee23201ef7a67c89c708b7cafc81328a40522" exitCode=0 Mar 09 14:08:43 crc kubenswrapper[4764]: I0309 14:08:43.290015 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpnh7" event={"ID":"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc","Type":"ContainerDied","Data":"1a77f0233da3724318ca2ac4138ee23201ef7a67c89c708b7cafc81328a40522"} Mar 09 14:08:43 crc kubenswrapper[4764]: I0309 14:08:43.290085 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpnh7" event={"ID":"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc","Type":"ContainerStarted","Data":"2afe4d1e32eda8deed06dfb789128c9ea6b154f3a0227143cbe819cb6c855247"} Mar 09 14:08:44 crc kubenswrapper[4764]: I0309 14:08:44.302748 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wfpcl" podUID="5f47d85f-63cb-45ce-b935-1b2a534523dc" containerName="registry-server" containerID="cri-o://da5f4a3cea7278ff0e8c3dfb2732a6e5cb3460daded67995e65716fc617874f5" gracePeriod=2 Mar 09 14:08:44 crc kubenswrapper[4764]: I0309 14:08:44.794803 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wfpcl" Mar 09 14:08:44 crc kubenswrapper[4764]: I0309 14:08:44.925574 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f47d85f-63cb-45ce-b935-1b2a534523dc-catalog-content\") pod \"5f47d85f-63cb-45ce-b935-1b2a534523dc\" (UID: \"5f47d85f-63cb-45ce-b935-1b2a534523dc\") " Mar 09 14:08:44 crc kubenswrapper[4764]: I0309 14:08:44.926016 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnxnf\" (UniqueName: \"kubernetes.io/projected/5f47d85f-63cb-45ce-b935-1b2a534523dc-kube-api-access-wnxnf\") pod \"5f47d85f-63cb-45ce-b935-1b2a534523dc\" (UID: \"5f47d85f-63cb-45ce-b935-1b2a534523dc\") " Mar 09 14:08:44 crc kubenswrapper[4764]: I0309 14:08:44.926113 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f47d85f-63cb-45ce-b935-1b2a534523dc-utilities\") pod \"5f47d85f-63cb-45ce-b935-1b2a534523dc\" (UID: \"5f47d85f-63cb-45ce-b935-1b2a534523dc\") " Mar 09 14:08:44 crc kubenswrapper[4764]: I0309 14:08:44.927606 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f47d85f-63cb-45ce-b935-1b2a534523dc-utilities" (OuterVolumeSpecName: "utilities") pod "5f47d85f-63cb-45ce-b935-1b2a534523dc" (UID: "5f47d85f-63cb-45ce-b935-1b2a534523dc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:08:44 crc kubenswrapper[4764]: I0309 14:08:44.936068 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f47d85f-63cb-45ce-b935-1b2a534523dc-kube-api-access-wnxnf" (OuterVolumeSpecName: "kube-api-access-wnxnf") pod "5f47d85f-63cb-45ce-b935-1b2a534523dc" (UID: "5f47d85f-63cb-45ce-b935-1b2a534523dc"). InnerVolumeSpecName "kube-api-access-wnxnf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:08:44 crc kubenswrapper[4764]: I0309 14:08:44.980342 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f47d85f-63cb-45ce-b935-1b2a534523dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5f47d85f-63cb-45ce-b935-1b2a534523dc" (UID: "5f47d85f-63cb-45ce-b935-1b2a534523dc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.029215 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f47d85f-63cb-45ce-b935-1b2a534523dc-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.029285 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f47d85f-63cb-45ce-b935-1b2a534523dc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.029306 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnxnf\" (UniqueName: \"kubernetes.io/projected/5f47d85f-63cb-45ce-b935-1b2a534523dc-kube-api-access-wnxnf\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.320874 4764 generic.go:334] "Generic (PLEG): container finished" podID="5f47d85f-63cb-45ce-b935-1b2a534523dc" containerID="da5f4a3cea7278ff0e8c3dfb2732a6e5cb3460daded67995e65716fc617874f5" exitCode=0 Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.320959 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wfpcl" event={"ID":"5f47d85f-63cb-45ce-b935-1b2a534523dc","Type":"ContainerDied","Data":"da5f4a3cea7278ff0e8c3dfb2732a6e5cb3460daded67995e65716fc617874f5"} Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.320998 4764 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-wfpcl" event={"ID":"5f47d85f-63cb-45ce-b935-1b2a534523dc","Type":"ContainerDied","Data":"cfc0c125ad4a9bdc4d2c39a4d4e59044968a7fa9e65b6f1d8fca1ba7dfcb96b5"} Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.321029 4764 scope.go:117] "RemoveContainer" containerID="da5f4a3cea7278ff0e8c3dfb2732a6e5cb3460daded67995e65716fc617874f5" Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.321247 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wfpcl" Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.326922 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpnh7" event={"ID":"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc","Type":"ContainerStarted","Data":"dd64f84858d0a06586cf1c2b645ebbce25af53fb3c654cb2755a8ed059ce3f84"} Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.385291 4764 scope.go:117] "RemoveContainer" containerID="5012025ebd18fb17d5d312a24940b888eecb61a59a4797cfdc87d49706f8c4cd" Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.394088 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wfpcl"] Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.408245 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wfpcl"] Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.417827 4764 scope.go:117] "RemoveContainer" containerID="bb942a0c55a1ddef8cb3af71369818a07d7c6cedd6474ba3eefacd8515ae9e03" Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.470050 4764 scope.go:117] "RemoveContainer" containerID="da5f4a3cea7278ff0e8c3dfb2732a6e5cb3460daded67995e65716fc617874f5" Mar 09 14:08:45 crc kubenswrapper[4764]: E0309 14:08:45.470896 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"da5f4a3cea7278ff0e8c3dfb2732a6e5cb3460daded67995e65716fc617874f5\": container with ID starting with da5f4a3cea7278ff0e8c3dfb2732a6e5cb3460daded67995e65716fc617874f5 not found: ID does not exist" containerID="da5f4a3cea7278ff0e8c3dfb2732a6e5cb3460daded67995e65716fc617874f5" Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.470957 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da5f4a3cea7278ff0e8c3dfb2732a6e5cb3460daded67995e65716fc617874f5"} err="failed to get container status \"da5f4a3cea7278ff0e8c3dfb2732a6e5cb3460daded67995e65716fc617874f5\": rpc error: code = NotFound desc = could not find container \"da5f4a3cea7278ff0e8c3dfb2732a6e5cb3460daded67995e65716fc617874f5\": container with ID starting with da5f4a3cea7278ff0e8c3dfb2732a6e5cb3460daded67995e65716fc617874f5 not found: ID does not exist" Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.470994 4764 scope.go:117] "RemoveContainer" containerID="5012025ebd18fb17d5d312a24940b888eecb61a59a4797cfdc87d49706f8c4cd" Mar 09 14:08:45 crc kubenswrapper[4764]: E0309 14:08:45.472236 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5012025ebd18fb17d5d312a24940b888eecb61a59a4797cfdc87d49706f8c4cd\": container with ID starting with 5012025ebd18fb17d5d312a24940b888eecb61a59a4797cfdc87d49706f8c4cd not found: ID does not exist" containerID="5012025ebd18fb17d5d312a24940b888eecb61a59a4797cfdc87d49706f8c4cd" Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.472268 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5012025ebd18fb17d5d312a24940b888eecb61a59a4797cfdc87d49706f8c4cd"} err="failed to get container status \"5012025ebd18fb17d5d312a24940b888eecb61a59a4797cfdc87d49706f8c4cd\": rpc error: code = NotFound desc = could not find container \"5012025ebd18fb17d5d312a24940b888eecb61a59a4797cfdc87d49706f8c4cd\": container with ID 
starting with 5012025ebd18fb17d5d312a24940b888eecb61a59a4797cfdc87d49706f8c4cd not found: ID does not exist" Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.472293 4764 scope.go:117] "RemoveContainer" containerID="bb942a0c55a1ddef8cb3af71369818a07d7c6cedd6474ba3eefacd8515ae9e03" Mar 09 14:08:45 crc kubenswrapper[4764]: E0309 14:08:45.472992 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb942a0c55a1ddef8cb3af71369818a07d7c6cedd6474ba3eefacd8515ae9e03\": container with ID starting with bb942a0c55a1ddef8cb3af71369818a07d7c6cedd6474ba3eefacd8515ae9e03 not found: ID does not exist" containerID="bb942a0c55a1ddef8cb3af71369818a07d7c6cedd6474ba3eefacd8515ae9e03" Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.473025 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb942a0c55a1ddef8cb3af71369818a07d7c6cedd6474ba3eefacd8515ae9e03"} err="failed to get container status \"bb942a0c55a1ddef8cb3af71369818a07d7c6cedd6474ba3eefacd8515ae9e03\": rpc error: code = NotFound desc = could not find container \"bb942a0c55a1ddef8cb3af71369818a07d7c6cedd6474ba3eefacd8515ae9e03\": container with ID starting with bb942a0c55a1ddef8cb3af71369818a07d7c6cedd6474ba3eefacd8515ae9e03 not found: ID does not exist" Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.575475 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f47d85f-63cb-45ce-b935-1b2a534523dc" path="/var/lib/kubelet/pods/5f47d85f-63cb-45ce-b935-1b2a534523dc/volumes" Mar 09 14:08:46 crc kubenswrapper[4764]: I0309 14:08:46.348585 4764 generic.go:334] "Generic (PLEG): container finished" podID="d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc" containerID="dd64f84858d0a06586cf1c2b645ebbce25af53fb3c654cb2755a8ed059ce3f84" exitCode=0 Mar 09 14:08:46 crc kubenswrapper[4764]: I0309 14:08:46.348694 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-vpnh7" event={"ID":"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc","Type":"ContainerDied","Data":"dd64f84858d0a06586cf1c2b645ebbce25af53fb3c654cb2755a8ed059ce3f84"} Mar 09 14:08:46 crc kubenswrapper[4764]: I0309 14:08:46.348766 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpnh7" event={"ID":"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc","Type":"ContainerStarted","Data":"a837b89cd63d6dc6240374dd2b172b2ef7e641ed8c4d15da1021f90ddd848323"} Mar 09 14:08:46 crc kubenswrapper[4764]: I0309 14:08:46.376037 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vpnh7" podStartSLOduration=2.919296757 podStartE2EDuration="5.376017622s" podCreationTimestamp="2026-03-09 14:08:41 +0000 UTC" firstStartedPulling="2026-03-09 14:08:43.29208592 +0000 UTC m=+2878.542257828" lastFinishedPulling="2026-03-09 14:08:45.748806785 +0000 UTC m=+2880.998978693" observedRunningTime="2026-03-09 14:08:46.372063266 +0000 UTC m=+2881.622235214" watchObservedRunningTime="2026-03-09 14:08:46.376017622 +0000 UTC m=+2881.626189530" Mar 09 14:08:52 crc kubenswrapper[4764]: I0309 14:08:52.263886 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vpnh7" Mar 09 14:08:52 crc kubenswrapper[4764]: I0309 14:08:52.283893 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vpnh7" Mar 09 14:08:52 crc kubenswrapper[4764]: I0309 14:08:52.333988 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vpnh7" Mar 09 14:08:52 crc kubenswrapper[4764]: I0309 14:08:52.459629 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vpnh7" Mar 09 14:08:52 crc kubenswrapper[4764]: I0309 14:08:52.576549 4764 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vpnh7"] Mar 09 14:08:54 crc kubenswrapper[4764]: I0309 14:08:54.424766 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vpnh7" podUID="d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc" containerName="registry-server" containerID="cri-o://a837b89cd63d6dc6240374dd2b172b2ef7e641ed8c4d15da1021f90ddd848323" gracePeriod=2 Mar 09 14:08:54 crc kubenswrapper[4764]: I0309 14:08:54.928688 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vpnh7" Mar 09 14:08:54 crc kubenswrapper[4764]: I0309 14:08:54.961225 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-utilities\") pod \"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc\" (UID: \"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc\") " Mar 09 14:08:54 crc kubenswrapper[4764]: I0309 14:08:54.961403 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-catalog-content\") pod \"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc\" (UID: \"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc\") " Mar 09 14:08:54 crc kubenswrapper[4764]: I0309 14:08:54.961514 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d46th\" (UniqueName: \"kubernetes.io/projected/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-kube-api-access-d46th\") pod \"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc\" (UID: \"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc\") " Mar 09 14:08:54 crc kubenswrapper[4764]: I0309 14:08:54.963609 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-utilities" (OuterVolumeSpecName: "utilities") pod "d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc" 
(UID: "d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:08:54 crc kubenswrapper[4764]: I0309 14:08:54.970553 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-kube-api-access-d46th" (OuterVolumeSpecName: "kube-api-access-d46th") pod "d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc" (UID: "d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc"). InnerVolumeSpecName "kube-api-access-d46th". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:08:55 crc kubenswrapper[4764]: I0309 14:08:55.063792 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d46th\" (UniqueName: \"kubernetes.io/projected/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-kube-api-access-d46th\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:55 crc kubenswrapper[4764]: I0309 14:08:55.063833 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:55 crc kubenswrapper[4764]: I0309 14:08:55.437537 4764 generic.go:334] "Generic (PLEG): container finished" podID="d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc" containerID="a837b89cd63d6dc6240374dd2b172b2ef7e641ed8c4d15da1021f90ddd848323" exitCode=0 Mar 09 14:08:55 crc kubenswrapper[4764]: I0309 14:08:55.437593 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpnh7" event={"ID":"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc","Type":"ContainerDied","Data":"a837b89cd63d6dc6240374dd2b172b2ef7e641ed8c4d15da1021f90ddd848323"} Mar 09 14:08:55 crc kubenswrapper[4764]: I0309 14:08:55.437634 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpnh7" 
event={"ID":"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc","Type":"ContainerDied","Data":"2afe4d1e32eda8deed06dfb789128c9ea6b154f3a0227143cbe819cb6c855247"} Mar 09 14:08:55 crc kubenswrapper[4764]: I0309 14:08:55.437630 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vpnh7" Mar 09 14:08:55 crc kubenswrapper[4764]: I0309 14:08:55.437675 4764 scope.go:117] "RemoveContainer" containerID="a837b89cd63d6dc6240374dd2b172b2ef7e641ed8c4d15da1021f90ddd848323" Mar 09 14:08:55 crc kubenswrapper[4764]: I0309 14:08:55.458352 4764 scope.go:117] "RemoveContainer" containerID="dd64f84858d0a06586cf1c2b645ebbce25af53fb3c654cb2755a8ed059ce3f84" Mar 09 14:08:55 crc kubenswrapper[4764]: I0309 14:08:55.482306 4764 scope.go:117] "RemoveContainer" containerID="1a77f0233da3724318ca2ac4138ee23201ef7a67c89c708b7cafc81328a40522" Mar 09 14:08:55 crc kubenswrapper[4764]: I0309 14:08:55.520726 4764 scope.go:117] "RemoveContainer" containerID="a837b89cd63d6dc6240374dd2b172b2ef7e641ed8c4d15da1021f90ddd848323" Mar 09 14:08:55 crc kubenswrapper[4764]: E0309 14:08:55.521641 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a837b89cd63d6dc6240374dd2b172b2ef7e641ed8c4d15da1021f90ddd848323\": container with ID starting with a837b89cd63d6dc6240374dd2b172b2ef7e641ed8c4d15da1021f90ddd848323 not found: ID does not exist" containerID="a837b89cd63d6dc6240374dd2b172b2ef7e641ed8c4d15da1021f90ddd848323" Mar 09 14:08:55 crc kubenswrapper[4764]: I0309 14:08:55.521724 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a837b89cd63d6dc6240374dd2b172b2ef7e641ed8c4d15da1021f90ddd848323"} err="failed to get container status \"a837b89cd63d6dc6240374dd2b172b2ef7e641ed8c4d15da1021f90ddd848323\": rpc error: code = NotFound desc = could not find container \"a837b89cd63d6dc6240374dd2b172b2ef7e641ed8c4d15da1021f90ddd848323\": 
container with ID starting with a837b89cd63d6dc6240374dd2b172b2ef7e641ed8c4d15da1021f90ddd848323 not found: ID does not exist" Mar 09 14:08:55 crc kubenswrapper[4764]: I0309 14:08:55.521762 4764 scope.go:117] "RemoveContainer" containerID="dd64f84858d0a06586cf1c2b645ebbce25af53fb3c654cb2755a8ed059ce3f84" Mar 09 14:08:55 crc kubenswrapper[4764]: E0309 14:08:55.522538 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd64f84858d0a06586cf1c2b645ebbce25af53fb3c654cb2755a8ed059ce3f84\": container with ID starting with dd64f84858d0a06586cf1c2b645ebbce25af53fb3c654cb2755a8ed059ce3f84 not found: ID does not exist" containerID="dd64f84858d0a06586cf1c2b645ebbce25af53fb3c654cb2755a8ed059ce3f84" Mar 09 14:08:55 crc kubenswrapper[4764]: I0309 14:08:55.522669 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd64f84858d0a06586cf1c2b645ebbce25af53fb3c654cb2755a8ed059ce3f84"} err="failed to get container status \"dd64f84858d0a06586cf1c2b645ebbce25af53fb3c654cb2755a8ed059ce3f84\": rpc error: code = NotFound desc = could not find container \"dd64f84858d0a06586cf1c2b645ebbce25af53fb3c654cb2755a8ed059ce3f84\": container with ID starting with dd64f84858d0a06586cf1c2b645ebbce25af53fb3c654cb2755a8ed059ce3f84 not found: ID does not exist" Mar 09 14:08:55 crc kubenswrapper[4764]: I0309 14:08:55.522759 4764 scope.go:117] "RemoveContainer" containerID="1a77f0233da3724318ca2ac4138ee23201ef7a67c89c708b7cafc81328a40522" Mar 09 14:08:55 crc kubenswrapper[4764]: E0309 14:08:55.523502 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a77f0233da3724318ca2ac4138ee23201ef7a67c89c708b7cafc81328a40522\": container with ID starting with 1a77f0233da3724318ca2ac4138ee23201ef7a67c89c708b7cafc81328a40522 not found: ID does not exist" 
containerID="1a77f0233da3724318ca2ac4138ee23201ef7a67c89c708b7cafc81328a40522" Mar 09 14:08:55 crc kubenswrapper[4764]: I0309 14:08:55.523585 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a77f0233da3724318ca2ac4138ee23201ef7a67c89c708b7cafc81328a40522"} err="failed to get container status \"1a77f0233da3724318ca2ac4138ee23201ef7a67c89c708b7cafc81328a40522\": rpc error: code = NotFound desc = could not find container \"1a77f0233da3724318ca2ac4138ee23201ef7a67c89c708b7cafc81328a40522\": container with ID starting with 1a77f0233da3724318ca2ac4138ee23201ef7a67c89c708b7cafc81328a40522 not found: ID does not exist" Mar 09 14:08:56 crc kubenswrapper[4764]: I0309 14:08:56.278922 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc" (UID: "d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:08:56 crc kubenswrapper[4764]: I0309 14:08:56.290774 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:56 crc kubenswrapper[4764]: I0309 14:08:56.375607 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vpnh7"] Mar 09 14:08:56 crc kubenswrapper[4764]: I0309 14:08:56.384538 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vpnh7"] Mar 09 14:08:57 crc kubenswrapper[4764]: I0309 14:08:57.572617 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc" path="/var/lib/kubelet/pods/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc/volumes" Mar 09 14:09:24 crc kubenswrapper[4764]: I0309 14:09:24.734888 4764 generic.go:334] "Generic (PLEG): container finished" podID="eab144b6-e27c-4ffc-9dd5-6236ca12719f" containerID="86fa8795ecd86c7b9c42df1c80c762e511f08dce7f9815c18841bff01c585b45" exitCode=0 Mar 09 14:09:24 crc kubenswrapper[4764]: I0309 14:09:24.734996 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" event={"ID":"eab144b6-e27c-4ffc-9dd5-6236ca12719f","Type":"ContainerDied","Data":"86fa8795ecd86c7b9c42df1c80c762e511f08dce7f9815c18841bff01c585b45"} Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.185190 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.288358 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-migration-ssh-key-1\") pod \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.288463 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c6xn\" (UniqueName: \"kubernetes.io/projected/eab144b6-e27c-4ffc-9dd5-6236ca12719f-kube-api-access-7c6xn\") pod \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.288516 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ssh-key-openstack-edpm-ipam\") pod \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.288668 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-extra-config-0\") pod \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.288763 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-2\") pod \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " Mar 09 14:09:26 crc kubenswrapper[4764]: 
I0309 14:09:26.288785 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-inventory\") pod \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.288808 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-0\") pod \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.288850 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-custom-ceph-combined-ca-bundle\") pod \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.288887 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ceph-nova-0\") pod \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.288919 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-1\") pod \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.288944 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: 
\"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-3\") pod \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.288961 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-migration-ssh-key-0\") pod \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.288985 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ceph\") pod \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.302660 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "eab144b6-e27c-4ffc-9dd5-6236ca12719f" (UID: "eab144b6-e27c-4ffc-9dd5-6236ca12719f"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.302830 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ceph" (OuterVolumeSpecName: "ceph") pod "eab144b6-e27c-4ffc-9dd5-6236ca12719f" (UID: "eab144b6-e27c-4ffc-9dd5-6236ca12719f"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.303846 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eab144b6-e27c-4ffc-9dd5-6236ca12719f-kube-api-access-7c6xn" (OuterVolumeSpecName: "kube-api-access-7c6xn") pod "eab144b6-e27c-4ffc-9dd5-6236ca12719f" (UID: "eab144b6-e27c-4ffc-9dd5-6236ca12719f"). InnerVolumeSpecName "kube-api-access-7c6xn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.323453 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "eab144b6-e27c-4ffc-9dd5-6236ca12719f" (UID: "eab144b6-e27c-4ffc-9dd5-6236ca12719f"). InnerVolumeSpecName "ceph-nova-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.324028 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "eab144b6-e27c-4ffc-9dd5-6236ca12719f" (UID: "eab144b6-e27c-4ffc-9dd5-6236ca12719f"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.328209 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "eab144b6-e27c-4ffc-9dd5-6236ca12719f" (UID: "eab144b6-e27c-4ffc-9dd5-6236ca12719f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.330300 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-inventory" (OuterVolumeSpecName: "inventory") pod "eab144b6-e27c-4ffc-9dd5-6236ca12719f" (UID: "eab144b6-e27c-4ffc-9dd5-6236ca12719f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.335872 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "eab144b6-e27c-4ffc-9dd5-6236ca12719f" (UID: "eab144b6-e27c-4ffc-9dd5-6236ca12719f"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.341702 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "eab144b6-e27c-4ffc-9dd5-6236ca12719f" (UID: "eab144b6-e27c-4ffc-9dd5-6236ca12719f"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.341839 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "eab144b6-e27c-4ffc-9dd5-6236ca12719f" (UID: "eab144b6-e27c-4ffc-9dd5-6236ca12719f"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.345110 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "eab144b6-e27c-4ffc-9dd5-6236ca12719f" (UID: "eab144b6-e27c-4ffc-9dd5-6236ca12719f"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.345603 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "eab144b6-e27c-4ffc-9dd5-6236ca12719f" (UID: "eab144b6-e27c-4ffc-9dd5-6236ca12719f"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.349543 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "eab144b6-e27c-4ffc-9dd5-6236ca12719f" (UID: "eab144b6-e27c-4ffc-9dd5-6236ca12719f"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.391186 4764 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.391237 4764 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.391251 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.391260 4764 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.391269 4764 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.391282 4764 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.391292 4764 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-1\") on 
node \"crc\" DevicePath \"\"" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.391302 4764 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.391310 4764 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.391319 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.391327 4764 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.391358 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c6xn\" (UniqueName: \"kubernetes.io/projected/eab144b6-e27c-4ffc-9dd5-6236ca12719f-kube-api-access-7c6xn\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.391370 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.758078 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" 
event={"ID":"eab144b6-e27c-4ffc-9dd5-6236ca12719f","Type":"ContainerDied","Data":"764af20acdc888d995ca2db68f3ab58c7fcd79d0559f4ce7bd21b3b2183c8e62"} Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.758420 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="764af20acdc888d995ca2db68f3ab58c7fcd79d0559f4ce7bd21b3b2183c8e62" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.758135 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:09:28 crc kubenswrapper[4764]: I0309 14:09:28.370390 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:09:28 crc kubenswrapper[4764]: I0309 14:09:28.370912 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:09:40 crc kubenswrapper[4764]: I0309 14:09:40.985481 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 09 14:09:40 crc kubenswrapper[4764]: E0309 14:09:40.986933 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f47d85f-63cb-45ce-b935-1b2a534523dc" containerName="extract-content" Mar 09 14:09:40 crc kubenswrapper[4764]: I0309 14:09:40.986954 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f47d85f-63cb-45ce-b935-1b2a534523dc" containerName="extract-content" Mar 09 14:09:40 crc kubenswrapper[4764]: E0309 14:09:40.986965 4764 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="5f47d85f-63cb-45ce-b935-1b2a534523dc" containerName="extract-utilities" Mar 09 14:09:40 crc kubenswrapper[4764]: I0309 14:09:40.986971 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f47d85f-63cb-45ce-b935-1b2a534523dc" containerName="extract-utilities" Mar 09 14:09:40 crc kubenswrapper[4764]: E0309 14:09:40.986987 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eab144b6-e27c-4ffc-9dd5-6236ca12719f" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Mar 09 14:09:40 crc kubenswrapper[4764]: I0309 14:09:40.986999 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="eab144b6-e27c-4ffc-9dd5-6236ca12719f" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Mar 09 14:09:40 crc kubenswrapper[4764]: E0309 14:09:40.987026 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f47d85f-63cb-45ce-b935-1b2a534523dc" containerName="registry-server" Mar 09 14:09:40 crc kubenswrapper[4764]: I0309 14:09:40.987034 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f47d85f-63cb-45ce-b935-1b2a534523dc" containerName="registry-server" Mar 09 14:09:40 crc kubenswrapper[4764]: E0309 14:09:40.987046 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc" containerName="extract-content" Mar 09 14:09:40 crc kubenswrapper[4764]: I0309 14:09:40.987052 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc" containerName="extract-content" Mar 09 14:09:40 crc kubenswrapper[4764]: E0309 14:09:40.987069 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc" containerName="registry-server" Mar 09 14:09:40 crc kubenswrapper[4764]: I0309 14:09:40.987076 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc" containerName="registry-server" Mar 09 14:09:40 crc kubenswrapper[4764]: 
E0309 14:09:40.987089 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc" containerName="extract-utilities" Mar 09 14:09:40 crc kubenswrapper[4764]: I0309 14:09:40.987096 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc" containerName="extract-utilities" Mar 09 14:09:40 crc kubenswrapper[4764]: I0309 14:09:40.987317 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="eab144b6-e27c-4ffc-9dd5-6236ca12719f" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Mar 09 14:09:40 crc kubenswrapper[4764]: I0309 14:09:40.987336 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc" containerName="registry-server" Mar 09 14:09:40 crc kubenswrapper[4764]: I0309 14:09:40.987348 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f47d85f-63cb-45ce-b935-1b2a534523dc" containerName="registry-server" Mar 09 14:09:40 crc kubenswrapper[4764]: I0309 14:09:40.988460 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:40 crc kubenswrapper[4764]: I0309 14:09:40.991240 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 14:09:40 crc kubenswrapper[4764]: I0309 14:09:40.992147 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.007226 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.054280 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.056218 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.065175 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.068807 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.132586 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2kwv\" (UniqueName: \"kubernetes.io/projected/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-kube-api-access-b2kwv\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.132752 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.132806 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.132834 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc 
kubenswrapper[4764]: I0309 14:09:41.132885 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.132918 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.132978 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.133013 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-run\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.133049 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.133073 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.133093 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-sys\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.133137 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.133211 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.133238 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.133286 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-dev\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.133308 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.235743 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.235823 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.235848 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.235874 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.235903 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.235946 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.235987 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-run\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.236007 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.236032 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: 
\"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.236058 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-sys\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.236087 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.236162 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.236193 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.236230 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-dev\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.236257 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.236301 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2kwv\" (UniqueName: \"kubernetes.io/projected/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-kube-api-access-b2kwv\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.243984 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.244060 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.246453 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.248567 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.248792 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.248864 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.248887 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-run\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.249101 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.249153 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " 
pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.249178 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-sys\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.249199 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.252928 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.253074 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.253113 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-dev\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.256006 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.260132 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2kwv\" (UniqueName: \"kubernetes.io/projected/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-kube-api-access-b2kwv\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.345148 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.022312 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-run\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.023319 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw5m9\" (UniqueName: \"kubernetes.io/projected/e6a8f674-82eb-4474-973d-54a90e5fd1e0-kube-api-access-nw5m9\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.023487 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-dev\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.057394 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e6a8f674-82eb-4474-973d-54a90e5fd1e0-ceph\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.249484 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-etc-nvme\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.253528 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.253611 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6a8f674-82eb-4474-973d-54a90e5fd1e0-scripts\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.253641 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.253797 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.253868 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6a8f674-82eb-4474-973d-54a90e5fd1e0-config-data-custom\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.253918 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.253949 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6a8f674-82eb-4474-973d-54a90e5fd1e0-config-data\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.254130 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.254210 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6a8f674-82eb-4474-973d-54a90e5fd1e0-combined-ca-bundle\") pod 
\"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.254253 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-lib-modules\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.254290 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-sys\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.287092 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.289584 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.312577 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-8p7c7" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.313016 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.313212 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.313406 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.323307 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.338432 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.339583 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.352489 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.353037 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.356549 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-etc-nvme\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.356659 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.356695 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6a8f674-82eb-4474-973d-54a90e5fd1e0-scripts\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.356721 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " 
pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.356788 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.356818 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6a8f674-82eb-4474-973d-54a90e5fd1e0-config-data-custom\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.356849 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.356874 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6a8f674-82eb-4474-973d-54a90e5fd1e0-config-data\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.356938 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.356976 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6a8f674-82eb-4474-973d-54a90e5fd1e0-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.357009 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-lib-modules\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.357788 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-sys\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.357889 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-run\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.357949 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw5m9\" (UniqueName: \"kubernetes.io/projected/e6a8f674-82eb-4474-973d-54a90e5fd1e0-kube-api-access-nw5m9\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.358020 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-dev\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 
crc kubenswrapper[4764]: I0309 14:09:42.358081 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e6a8f674-82eb-4474-973d-54a90e5fd1e0-ceph\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.362108 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.362774 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-run\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.368246 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.368312 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.368333 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-dev\") pod \"cinder-backup-0\" (UID: 
\"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.368527 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-lib-modules\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.368558 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-sys\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.368612 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-etc-nvme\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.368674 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.368710 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.373118 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/e6a8f674-82eb-4474-973d-54a90e5fd1e0-ceph\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.387156 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6a8f674-82eb-4474-973d-54a90e5fd1e0-config-data\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.388060 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.409607 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw5m9\" (UniqueName: \"kubernetes.io/projected/e6a8f674-82eb-4474-973d-54a90e5fd1e0-kube-api-access-nw5m9\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.410299 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6a8f674-82eb-4474-973d-54a90e5fd1e0-scripts\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.422949 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6a8f674-82eb-4474-973d-54a90e5fd1e0-config-data-custom\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.426329 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e6a8f674-82eb-4474-973d-54a90e5fd1e0-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463104 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-config-data\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463168 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t465\" (UniqueName: \"kubernetes.io/projected/49ee7179-02d8-4e07-9bb8-fce22456e804-kube-api-access-4t465\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463205 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/49ee7179-02d8-4e07-9bb8-fce22456e804-ceph\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463264 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463284 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-scripts\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463332 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463379 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463408 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/349527e3-93e0-4342-845c-eb8775ab3e5a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463430 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/349527e3-93e0-4342-845c-eb8775ab3e5a-ceph\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463473 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463492 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463511 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/49ee7179-02d8-4e07-9bb8-fce22456e804-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463535 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463570 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463590 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-kxdzv\" (UniqueName: \"kubernetes.io/projected/349527e3-93e0-4342-845c-eb8775ab3e5a-kube-api-access-kxdzv\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463619 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/349527e3-93e0-4342-845c-eb8775ab3e5a-logs\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463662 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463677 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49ee7179-02d8-4e07-9bb8-fce22456e804-logs\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572196 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572283 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572310 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/49ee7179-02d8-4e07-9bb8-fce22456e804-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572340 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572377 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572410 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxdzv\" (UniqueName: \"kubernetes.io/projected/349527e3-93e0-4342-845c-eb8775ab3e5a-kube-api-access-kxdzv\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572449 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/349527e3-93e0-4342-845c-eb8775ab3e5a-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572479 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572498 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49ee7179-02d8-4e07-9bb8-fce22456e804-logs\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572520 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-config-data\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572546 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t465\" (UniqueName: \"kubernetes.io/projected/49ee7179-02d8-4e07-9bb8-fce22456e804-kube-api-access-4t465\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572581 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/49ee7179-02d8-4e07-9bb8-fce22456e804-ceph\") pod \"glance-default-external-api-0\" (UID: 
\"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572676 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572703 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-scripts\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572764 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572814 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572848 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/349527e3-93e0-4342-845c-eb8775ab3e5a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " 
pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572880 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/349527e3-93e0-4342-845c-eb8775ab3e5a-ceph\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.577512 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/349527e3-93e0-4342-845c-eb8775ab3e5a-ceph\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.578022 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.583280 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49ee7179-02d8-4e07-9bb8-fce22456e804-logs\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.583444 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: 
I0309 14:09:42.588220 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/49ee7179-02d8-4e07-9bb8-fce22456e804-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.589548 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.592486 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.599885 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/349527e3-93e0-4342-845c-eb8775ab3e5a-logs\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.600538 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/349527e3-93e0-4342-845c-eb8775ab3e5a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.618031 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc 
kubenswrapper[4764]: I0309 14:09:42.620285 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.621504 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-config-data\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.622374 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.623839 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t465\" (UniqueName: \"kubernetes.io/projected/49ee7179-02d8-4e07-9bb8-fce22456e804-kube-api-access-4t465\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.624439 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/49ee7179-02d8-4e07-9bb8-fce22456e804-ceph\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.624630 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-scripts\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.627568 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.633342 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.705446 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxdzv\" (UniqueName: \"kubernetes.io/projected/349527e3-93e0-4342-845c-eb8775ab3e5a-kube-api-access-kxdzv\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.723930 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.772298 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.784779 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.953438 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.976931 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-fdg68"] Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.978370 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-fdg68" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:42.999230 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-fdg68"] Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.093480 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e1948a5-46f6-412d-91c3-bf9c255e02fc-operator-scripts\") pod \"manila-db-create-fdg68\" (UID: \"5e1948a5-46f6-412d-91c3-bf9c255e02fc\") " pod="openstack/manila-db-create-fdg68" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.094071 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnpqp\" (UniqueName: \"kubernetes.io/projected/5e1948a5-46f6-412d-91c3-bf9c255e02fc-kube-api-access-cnpqp\") pod \"manila-db-create-fdg68\" (UID: \"5e1948a5-46f6-412d-91c3-bf9c255e02fc\") " pod="openstack/manila-db-create-fdg68" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.137579 4764 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-9d94c4c7-px8jh"] Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.139534 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.153863 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.154126 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.154294 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.157381 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-rwzsj" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.190810 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9d94c4c7-px8jh"] Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.198867 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktc9m\" (UniqueName: \"kubernetes.io/projected/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-kube-api-access-ktc9m\") pod \"horizon-9d94c4c7-px8jh\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.198939 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e1948a5-46f6-412d-91c3-bf9c255e02fc-operator-scripts\") pod \"manila-db-create-fdg68\" (UID: \"5e1948a5-46f6-412d-91c3-bf9c255e02fc\") " pod="openstack/manila-db-create-fdg68" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.198969 4764 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-cnpqp\" (UniqueName: \"kubernetes.io/projected/5e1948a5-46f6-412d-91c3-bf9c255e02fc-kube-api-access-cnpqp\") pod \"manila-db-create-fdg68\" (UID: \"5e1948a5-46f6-412d-91c3-bf9c255e02fc\") " pod="openstack/manila-db-create-fdg68" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.198987 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-config-data\") pod \"horizon-9d94c4c7-px8jh\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.199037 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-scripts\") pod \"horizon-9d94c4c7-px8jh\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.199063 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-horizon-secret-key\") pod \"horizon-9d94c4c7-px8jh\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.199081 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-logs\") pod \"horizon-9d94c4c7-px8jh\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.207683 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/5e1948a5-46f6-412d-91c3-bf9c255e02fc-operator-scripts\") pod \"manila-db-create-fdg68\" (UID: \"5e1948a5-46f6-412d-91c3-bf9c255e02fc\") " pod="openstack/manila-db-create-fdg68" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.236544 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.264909 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnpqp\" (UniqueName: \"kubernetes.io/projected/5e1948a5-46f6-412d-91c3-bf9c255e02fc-kube-api-access-cnpqp\") pod \"manila-db-create-fdg68\" (UID: \"5e1948a5-46f6-412d-91c3-bf9c255e02fc\") " pod="openstack/manila-db-create-fdg68" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.307898 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-scripts\") pod \"horizon-9d94c4c7-px8jh\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.307978 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-horizon-secret-key\") pod \"horizon-9d94c4c7-px8jh\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.308004 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-logs\") pod \"horizon-9d94c4c7-px8jh\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.308201 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ktc9m\" (UniqueName: \"kubernetes.io/projected/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-kube-api-access-ktc9m\") pod \"horizon-9d94c4c7-px8jh\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.312077 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-scripts\") pod \"horizon-9d94c4c7-px8jh\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.312168 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-logs\") pod \"horizon-9d94c4c7-px8jh\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.314172 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-config-data\") pod \"horizon-9d94c4c7-px8jh\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.315026 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-horizon-secret-key\") pod \"horizon-9d94c4c7-px8jh\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.317566 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-config-data\") pod \"horizon-9d94c4c7-px8jh\" (UID: 
\"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.326802 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-fdg68" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.331726 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.391712 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktc9m\" (UniqueName: \"kubernetes.io/projected/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-kube-api-access-ktc9m\") pod \"horizon-9d94c4c7-px8jh\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.403814 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-1df0-account-create-update-bg564"] Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.405750 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-1df0-account-create-update-bg564" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.411921 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.435457 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-1df0-account-create-update-bg564"] Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.450341 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-55f8b7fc4c-6rdd7"] Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.466731 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.480317 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-55f8b7fc4c-6rdd7"] Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.489599 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.519023 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.524085 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlm9t\" (UniqueName: \"kubernetes.io/projected/6b944cf9-8278-4b16-b09c-0da6a2519b2a-kube-api-access-nlm9t\") pod \"horizon-55f8b7fc4c-6rdd7\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.524177 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ce9bce5-9c23-40ac-9683-6fb232e32c3c-operator-scripts\") pod \"manila-1df0-account-create-update-bg564\" (UID: \"7ce9bce5-9c23-40ac-9683-6fb232e32c3c\") " pod="openstack/manila-1df0-account-create-update-bg564" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.524221 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b944cf9-8278-4b16-b09c-0da6a2519b2a-config-data\") pod \"horizon-55f8b7fc4c-6rdd7\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.524273 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/6b944cf9-8278-4b16-b09c-0da6a2519b2a-scripts\") pod \"horizon-55f8b7fc4c-6rdd7\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.524320 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl52v\" (UniqueName: \"kubernetes.io/projected/7ce9bce5-9c23-40ac-9683-6fb232e32c3c-kube-api-access-pl52v\") pod \"manila-1df0-account-create-update-bg564\" (UID: \"7ce9bce5-9c23-40ac-9683-6fb232e32c3c\") " pod="openstack/manila-1df0-account-create-update-bg564" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.524352 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b944cf9-8278-4b16-b09c-0da6a2519b2a-logs\") pod \"horizon-55f8b7fc4c-6rdd7\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.524390 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6b944cf9-8278-4b16-b09c-0da6a2519b2a-horizon-secret-key\") pod \"horizon-55f8b7fc4c-6rdd7\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.626433 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl52v\" (UniqueName: \"kubernetes.io/projected/7ce9bce5-9c23-40ac-9683-6fb232e32c3c-kube-api-access-pl52v\") pod \"manila-1df0-account-create-update-bg564\" (UID: \"7ce9bce5-9c23-40ac-9683-6fb232e32c3c\") " pod="openstack/manila-1df0-account-create-update-bg564" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.626504 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b944cf9-8278-4b16-b09c-0da6a2519b2a-logs\") pod \"horizon-55f8b7fc4c-6rdd7\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.626541 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6b944cf9-8278-4b16-b09c-0da6a2519b2a-horizon-secret-key\") pod \"horizon-55f8b7fc4c-6rdd7\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.626682 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlm9t\" (UniqueName: \"kubernetes.io/projected/6b944cf9-8278-4b16-b09c-0da6a2519b2a-kube-api-access-nlm9t\") pod \"horizon-55f8b7fc4c-6rdd7\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.626807 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ce9bce5-9c23-40ac-9683-6fb232e32c3c-operator-scripts\") pod \"manila-1df0-account-create-update-bg564\" (UID: \"7ce9bce5-9c23-40ac-9683-6fb232e32c3c\") " pod="openstack/manila-1df0-account-create-update-bg564" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.626839 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b944cf9-8278-4b16-b09c-0da6a2519b2a-config-data\") pod \"horizon-55f8b7fc4c-6rdd7\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.626908 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/6b944cf9-8278-4b16-b09c-0da6a2519b2a-scripts\") pod \"horizon-55f8b7fc4c-6rdd7\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.627875 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b944cf9-8278-4b16-b09c-0da6a2519b2a-scripts\") pod \"horizon-55f8b7fc4c-6rdd7\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.628510 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b944cf9-8278-4b16-b09c-0da6a2519b2a-logs\") pod \"horizon-55f8b7fc4c-6rdd7\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.629391 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ce9bce5-9c23-40ac-9683-6fb232e32c3c-operator-scripts\") pod \"manila-1df0-account-create-update-bg564\" (UID: \"7ce9bce5-9c23-40ac-9683-6fb232e32c3c\") " pod="openstack/manila-1df0-account-create-update-bg564" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.632411 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b944cf9-8278-4b16-b09c-0da6a2519b2a-config-data\") pod \"horizon-55f8b7fc4c-6rdd7\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.643193 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6b944cf9-8278-4b16-b09c-0da6a2519b2a-horizon-secret-key\") pod \"horizon-55f8b7fc4c-6rdd7\" (UID: 
\"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.695504 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl52v\" (UniqueName: \"kubernetes.io/projected/7ce9bce5-9c23-40ac-9683-6fb232e32c3c-kube-api-access-pl52v\") pod \"manila-1df0-account-create-update-bg564\" (UID: \"7ce9bce5-9c23-40ac-9683-6fb232e32c3c\") " pod="openstack/manila-1df0-account-create-update-bg564" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.695933 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlm9t\" (UniqueName: \"kubernetes.io/projected/6b944cf9-8278-4b16-b09c-0da6a2519b2a-kube-api-access-nlm9t\") pod \"horizon-55f8b7fc4c-6rdd7\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.831944 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.897431 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-1df0-account-create-update-bg564" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.899332 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.924566 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 14:09:44 crc kubenswrapper[4764]: I0309 14:09:44.083823 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 14:09:44 crc kubenswrapper[4764]: I0309 14:09:44.206146 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-fdg68"] Mar 09 14:09:44 crc kubenswrapper[4764]: I0309 14:09:44.291564 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-fdg68" event={"ID":"5e1948a5-46f6-412d-91c3-bf9c255e02fc","Type":"ContainerStarted","Data":"2f209c4397604d7a0a9dbd934378982e63afb3fec9341073117892ee26e51dd6"} Mar 09 14:09:44 crc kubenswrapper[4764]: I0309 14:09:44.296715 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"349527e3-93e0-4342-845c-eb8775ab3e5a","Type":"ContainerStarted","Data":"6f8e9f0abe6ca5e7bff14c308c2db546c0b2c9b94e174f7fa80c547c31b16a26"} Mar 09 14:09:44 crc kubenswrapper[4764]: I0309 14:09:44.300820 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"49ee7179-02d8-4e07-9bb8-fce22456e804","Type":"ContainerStarted","Data":"1f84cee9a80ba50e44abce0cb213778729f5e3cb3d6103fdf6b0ebee436d5285"} Mar 09 14:09:44 crc kubenswrapper[4764]: I0309 14:09:44.302866 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f","Type":"ContainerStarted","Data":"1a5be9c37bf14ba0e2041bc44e4cd0834044e096c24171216c18d21b7820634e"} Mar 09 14:09:44 crc 
kubenswrapper[4764]: I0309 14:09:44.323291 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"e6a8f674-82eb-4474-973d-54a90e5fd1e0","Type":"ContainerStarted","Data":"c70ee790255d3544035eda73ecba123e00e43636af92765b71d3c5750bfab7a8"} Mar 09 14:09:44 crc kubenswrapper[4764]: I0309 14:09:44.485691 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9d94c4c7-px8jh"] Mar 09 14:09:44 crc kubenswrapper[4764]: I0309 14:09:44.637623 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-1df0-account-create-update-bg564"] Mar 09 14:09:44 crc kubenswrapper[4764]: I0309 14:09:44.657304 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-55f8b7fc4c-6rdd7"] Mar 09 14:09:44 crc kubenswrapper[4764]: W0309 14:09:44.795190 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b944cf9_8278_4b16_b09c_0da6a2519b2a.slice/crio-03eb026db669a1bbc8569cf6227719fa0346433ca9e38091ef450a1f5669648c WatchSource:0}: Error finding container 03eb026db669a1bbc8569cf6227719fa0346433ca9e38091ef450a1f5669648c: Status 404 returned error can't find the container with id 03eb026db669a1bbc8569cf6227719fa0346433ca9e38091ef450a1f5669648c Mar 09 14:09:45 crc kubenswrapper[4764]: I0309 14:09:45.344770 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9d94c4c7-px8jh" event={"ID":"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243","Type":"ContainerStarted","Data":"dc93d1c29f222a1602e24053b4830b3eddb16bbf1d9374aeb6068fdf3ba6030f"} Mar 09 14:09:45 crc kubenswrapper[4764]: I0309 14:09:45.349617 4764 generic.go:334] "Generic (PLEG): container finished" podID="5e1948a5-46f6-412d-91c3-bf9c255e02fc" containerID="89bd3940c282c4190fbd12e55d3b6572d76973664eb8e08171ebbd6ade38f50f" exitCode=0 Mar 09 14:09:45 crc kubenswrapper[4764]: I0309 14:09:45.349688 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/manila-db-create-fdg68" event={"ID":"5e1948a5-46f6-412d-91c3-bf9c255e02fc","Type":"ContainerDied","Data":"89bd3940c282c4190fbd12e55d3b6572d76973664eb8e08171ebbd6ade38f50f"} Mar 09 14:09:45 crc kubenswrapper[4764]: I0309 14:09:45.351724 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55f8b7fc4c-6rdd7" event={"ID":"6b944cf9-8278-4b16-b09c-0da6a2519b2a","Type":"ContainerStarted","Data":"03eb026db669a1bbc8569cf6227719fa0346433ca9e38091ef450a1f5669648c"} Mar 09 14:09:45 crc kubenswrapper[4764]: I0309 14:09:45.355629 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"349527e3-93e0-4342-845c-eb8775ab3e5a","Type":"ContainerStarted","Data":"30776e46d343f3c10f72753cb59ea338812b698d41259546e8c5f506b57f2e53"} Mar 09 14:09:45 crc kubenswrapper[4764]: I0309 14:09:45.362674 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"49ee7179-02d8-4e07-9bb8-fce22456e804","Type":"ContainerStarted","Data":"07b51e4be4c9ba108c01cded8c1c80a772fa736483e292b6667b962df1606942"} Mar 09 14:09:45 crc kubenswrapper[4764]: I0309 14:09:45.367571 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-1df0-account-create-update-bg564" event={"ID":"7ce9bce5-9c23-40ac-9683-6fb232e32c3c","Type":"ContainerStarted","Data":"bdf07c58d276db116b39b558183c8a6af0bb01c84a705a0b57c15a4ac4f6b634"} Mar 09 14:09:45 crc kubenswrapper[4764]: I0309 14:09:45.367662 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-1df0-account-create-update-bg564" event={"ID":"7ce9bce5-9c23-40ac-9683-6fb232e32c3c","Type":"ContainerStarted","Data":"2002bb74ed38eb2c1f0e486f850b4e290555f1f76e89842042231f9e9946ffde"} Mar 09 14:09:45 crc kubenswrapper[4764]: I0309 14:09:45.411525 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-1df0-account-create-update-bg564" 
podStartSLOduration=2.411501615 podStartE2EDuration="2.411501615s" podCreationTimestamp="2026-03-09 14:09:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:09:45.401118068 +0000 UTC m=+2940.651289976" watchObservedRunningTime="2026-03-09 14:09:45.411501615 +0000 UTC m=+2940.661673523" Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.386634 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f","Type":"ContainerStarted","Data":"c18f47a7c064950f846a245ffd438b68e0328a931476ffa1517c78de39cd79df"} Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.389169 4764 generic.go:334] "Generic (PLEG): container finished" podID="7ce9bce5-9c23-40ac-9683-6fb232e32c3c" containerID="bdf07c58d276db116b39b558183c8a6af0bb01c84a705a0b57c15a4ac4f6b634" exitCode=0 Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.389219 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-1df0-account-create-update-bg564" event={"ID":"7ce9bce5-9c23-40ac-9683-6fb232e32c3c","Type":"ContainerDied","Data":"bdf07c58d276db116b39b558183c8a6af0bb01c84a705a0b57c15a4ac4f6b634"} Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.404125 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"e6a8f674-82eb-4474-973d-54a90e5fd1e0","Type":"ContainerStarted","Data":"fa40eb8f95c156ce0a45864c5ed3bf173bfb2745018c72edc9ef389478dae117"} Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.759334 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-55f8b7fc4c-6rdd7"] Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.851216 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-797d44c9b-wrlx7"] Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.866955 4764 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.876578 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.893019 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-797d44c9b-wrlx7"] Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.922764 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-9d94c4c7-px8jh"] Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.954858 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e00fc104-ec73-4190-a598-86de7ca6cfa5-config-data\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.954940 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-horizon-tls-certs\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.954988 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-horizon-secret-key\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.955041 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/e00fc104-ec73-4190-a598-86de7ca6cfa5-scripts\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.955074 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-combined-ca-bundle\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.955117 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e00fc104-ec73-4190-a598-86de7ca6cfa5-logs\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.955158 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l5bd\" (UniqueName: \"kubernetes.io/projected/e00fc104-ec73-4190-a598-86de7ca6cfa5-kube-api-access-9l5bd\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.970914 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-fdg68" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.000732 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-56bb55c768-vchmw"] Mar 09 14:09:47 crc kubenswrapper[4764]: E0309 14:09:47.001438 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e1948a5-46f6-412d-91c3-bf9c255e02fc" containerName="mariadb-database-create" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.001466 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e1948a5-46f6-412d-91c3-bf9c255e02fc" containerName="mariadb-database-create" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.001728 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e1948a5-46f6-412d-91c3-bf9c255e02fc" containerName="mariadb-database-create" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.003209 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.029106 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-56bb55c768-vchmw"] Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.057099 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnpqp\" (UniqueName: \"kubernetes.io/projected/5e1948a5-46f6-412d-91c3-bf9c255e02fc-kube-api-access-cnpqp\") pod \"5e1948a5-46f6-412d-91c3-bf9c255e02fc\" (UID: \"5e1948a5-46f6-412d-91c3-bf9c255e02fc\") " Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.057230 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e1948a5-46f6-412d-91c3-bf9c255e02fc-operator-scripts\") pod \"5e1948a5-46f6-412d-91c3-bf9c255e02fc\" (UID: \"5e1948a5-46f6-412d-91c3-bf9c255e02fc\") " Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.057554 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l5bd\" (UniqueName: \"kubernetes.io/projected/e00fc104-ec73-4190-a598-86de7ca6cfa5-kube-api-access-9l5bd\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.057609 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/47ef29f6-4627-4b84-968d-db9d7ed438da-config-data\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.057694 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzxxs\" (UniqueName: \"kubernetes.io/projected/47ef29f6-4627-4b84-968d-db9d7ed438da-kube-api-access-kzxxs\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.057775 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47ef29f6-4627-4b84-968d-db9d7ed438da-combined-ca-bundle\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.057843 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47ef29f6-4627-4b84-968d-db9d7ed438da-logs\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.057913 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e00fc104-ec73-4190-a598-86de7ca6cfa5-config-data\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.057933 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-horizon-tls-certs\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.057968 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/47ef29f6-4627-4b84-968d-db9d7ed438da-scripts\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.057998 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-horizon-secret-key\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.058073 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e00fc104-ec73-4190-a598-86de7ca6cfa5-scripts\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.058103 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/47ef29f6-4627-4b84-968d-db9d7ed438da-horizon-secret-key\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.058136 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-combined-ca-bundle\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.058168 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/47ef29f6-4627-4b84-968d-db9d7ed438da-horizon-tls-certs\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.058214 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e00fc104-ec73-4190-a598-86de7ca6cfa5-logs\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.058810 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e00fc104-ec73-4190-a598-86de7ca6cfa5-logs\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.060250 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e00fc104-ec73-4190-a598-86de7ca6cfa5-config-data\") pod 
\"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.060859 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e1948a5-46f6-412d-91c3-bf9c255e02fc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5e1948a5-46f6-412d-91c3-bf9c255e02fc" (UID: "5e1948a5-46f6-412d-91c3-bf9c255e02fc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.066053 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e00fc104-ec73-4190-a598-86de7ca6cfa5-scripts\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.073244 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-horizon-secret-key\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.073669 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-combined-ca-bundle\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.080440 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l5bd\" (UniqueName: \"kubernetes.io/projected/e00fc104-ec73-4190-a598-86de7ca6cfa5-kube-api-access-9l5bd\") pod \"horizon-797d44c9b-wrlx7\" (UID: 
\"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.107242 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-horizon-tls-certs\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.269203 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e1948a5-46f6-412d-91c3-bf9c255e02fc-kube-api-access-cnpqp" (OuterVolumeSpecName: "kube-api-access-cnpqp") pod "5e1948a5-46f6-412d-91c3-bf9c255e02fc" (UID: "5e1948a5-46f6-412d-91c3-bf9c255e02fc"). InnerVolumeSpecName "kube-api-access-cnpqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.271119 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.272056 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47ef29f6-4627-4b84-968d-db9d7ed438da-combined-ca-bundle\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.272186 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47ef29f6-4627-4b84-968d-db9d7ed438da-logs\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.272318 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/47ef29f6-4627-4b84-968d-db9d7ed438da-scripts\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.272429 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/47ef29f6-4627-4b84-968d-db9d7ed438da-horizon-secret-key\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.272498 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/47ef29f6-4627-4b84-968d-db9d7ed438da-horizon-tls-certs\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 
14:09:47.272678 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/47ef29f6-4627-4b84-968d-db9d7ed438da-config-data\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.272778 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzxxs\" (UniqueName: \"kubernetes.io/projected/47ef29f6-4627-4b84-968d-db9d7ed438da-kube-api-access-kzxxs\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.273822 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnpqp\" (UniqueName: \"kubernetes.io/projected/5e1948a5-46f6-412d-91c3-bf9c255e02fc-kube-api-access-cnpqp\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.277568 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e1948a5-46f6-412d-91c3-bf9c255e02fc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.281133 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/47ef29f6-4627-4b84-968d-db9d7ed438da-scripts\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.283425 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47ef29f6-4627-4b84-968d-db9d7ed438da-logs\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 
14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.288374 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/47ef29f6-4627-4b84-968d-db9d7ed438da-horizon-tls-certs\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.289978 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/47ef29f6-4627-4b84-968d-db9d7ed438da-config-data\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.300194 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47ef29f6-4627-4b84-968d-db9d7ed438da-combined-ca-bundle\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.314273 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzxxs\" (UniqueName: \"kubernetes.io/projected/47ef29f6-4627-4b84-968d-db9d7ed438da-kube-api-access-kzxxs\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.322320 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/47ef29f6-4627-4b84-968d-db9d7ed438da-horizon-secret-key\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.588283 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.688398 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"349527e3-93e0-4342-845c-eb8775ab3e5a","Type":"ContainerStarted","Data":"ecd62f1b0eb5226209d3ecf6887c5a0f2d0feaa519397c875dd114ffc89d8b3b"} Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.688664 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="349527e3-93e0-4342-845c-eb8775ab3e5a" containerName="glance-log" containerID="cri-o://30776e46d343f3c10f72753cb59ea338812b698d41259546e8c5f506b57f2e53" gracePeriod=30 Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.688918 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="349527e3-93e0-4342-845c-eb8775ab3e5a" containerName="glance-httpd" containerID="cri-o://ecd62f1b0eb5226209d3ecf6887c5a0f2d0feaa519397c875dd114ffc89d8b3b" gracePeriod=30 Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.697295 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"49ee7179-02d8-4e07-9bb8-fce22456e804","Type":"ContainerStarted","Data":"1d3f3c09a65ee0d96c5761cf37afd174b0dd657fc73b2eb650e087f0fb7b89e4"} Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.697389 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="49ee7179-02d8-4e07-9bb8-fce22456e804" containerName="glance-log" containerID="cri-o://07b51e4be4c9ba108c01cded8c1c80a772fa736483e292b6667b962df1606942" gracePeriod=30 Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.697413 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" 
podUID="49ee7179-02d8-4e07-9bb8-fce22456e804" containerName="glance-httpd" containerID="cri-o://1d3f3c09a65ee0d96c5761cf37afd174b0dd657fc73b2eb650e087f0fb7b89e4" gracePeriod=30 Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.703100 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f","Type":"ContainerStarted","Data":"51fce251ae6f91898056967dc7598e9f180480171f8ac8dce0e6615b0fed1c2e"} Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.713926 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"e6a8f674-82eb-4474-973d-54a90e5fd1e0","Type":"ContainerStarted","Data":"129f5e2081595e346ca05bf1c0a5f39318c542efe9f77225ecf83f06ff7156ab"} Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.727951 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-fdg68" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.728917 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-fdg68" event={"ID":"5e1948a5-46f6-412d-91c3-bf9c255e02fc","Type":"ContainerDied","Data":"2f209c4397604d7a0a9dbd934378982e63afb3fec9341073117892ee26e51dd6"} Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.728974 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f209c4397604d7a0a9dbd934378982e63afb3fec9341073117892ee26e51dd6" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.734989 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.734958993 podStartE2EDuration="7.734958993s" podCreationTimestamp="2026-03-09 14:09:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:09:47.725950413 +0000 UTC m=+2942.976122321" 
watchObservedRunningTime="2026-03-09 14:09:47.734958993 +0000 UTC m=+2942.985130901" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.757804 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=6.006113753 podStartE2EDuration="7.757780743s" podCreationTimestamp="2026-03-09 14:09:40 +0000 UTC" firstStartedPulling="2026-03-09 14:09:43.267317795 +0000 UTC m=+2938.517489703" lastFinishedPulling="2026-03-09 14:09:45.018984785 +0000 UTC m=+2940.269156693" observedRunningTime="2026-03-09 14:09:47.753214111 +0000 UTC m=+2943.003386039" watchObservedRunningTime="2026-03-09 14:09:47.757780743 +0000 UTC m=+2943.007952651" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.859192 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=5.288626447 podStartE2EDuration="6.85916377s" podCreationTimestamp="2026-03-09 14:09:41 +0000 UTC" firstStartedPulling="2026-03-09 14:09:43.898605171 +0000 UTC m=+2939.148777079" lastFinishedPulling="2026-03-09 14:09:45.469142494 +0000 UTC m=+2940.719314402" observedRunningTime="2026-03-09 14:09:47.801252973 +0000 UTC m=+2943.051424891" watchObservedRunningTime="2026-03-09 14:09:47.85916377 +0000 UTC m=+2943.109335678" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.953133 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.953099468 podStartE2EDuration="7.953099468s" podCreationTimestamp="2026-03-09 14:09:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:09:47.823708463 +0000 UTC m=+2943.073880381" watchObservedRunningTime="2026-03-09 14:09:47.953099468 +0000 UTC m=+2943.203271386" Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.442503 4764 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/manila-1df0-account-create-update-bg564" Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.550980 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pl52v\" (UniqueName: \"kubernetes.io/projected/7ce9bce5-9c23-40ac-9683-6fb232e32c3c-kube-api-access-pl52v\") pod \"7ce9bce5-9c23-40ac-9683-6fb232e32c3c\" (UID: \"7ce9bce5-9c23-40ac-9683-6fb232e32c3c\") " Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.551573 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ce9bce5-9c23-40ac-9683-6fb232e32c3c-operator-scripts\") pod \"7ce9bce5-9c23-40ac-9683-6fb232e32c3c\" (UID: \"7ce9bce5-9c23-40ac-9683-6fb232e32c3c\") " Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.554314 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ce9bce5-9c23-40ac-9683-6fb232e32c3c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7ce9bce5-9c23-40ac-9683-6fb232e32c3c" (UID: "7ce9bce5-9c23-40ac-9683-6fb232e32c3c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.563617 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ce9bce5-9c23-40ac-9683-6fb232e32c3c-kube-api-access-pl52v" (OuterVolumeSpecName: "kube-api-access-pl52v") pod "7ce9bce5-9c23-40ac-9683-6fb232e32c3c" (UID: "7ce9bce5-9c23-40ac-9683-6fb232e32c3c"). InnerVolumeSpecName "kube-api-access-pl52v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.566873 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-797d44c9b-wrlx7"] Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.577454 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-56bb55c768-vchmw"] Mar 09 14:09:48 crc kubenswrapper[4764]: W0309 14:09:48.624215 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47ef29f6_4627_4b84_968d_db9d7ed438da.slice/crio-99ac8673ac7f481cb7c33e55443030ac6b7a346a12cfd65d6c94f0f8452e36e3 WatchSource:0}: Error finding container 99ac8673ac7f481cb7c33e55443030ac6b7a346a12cfd65d6c94f0f8452e36e3: Status 404 returned error can't find the container with id 99ac8673ac7f481cb7c33e55443030ac6b7a346a12cfd65d6c94f0f8452e36e3 Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.670507 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pl52v\" (UniqueName: \"kubernetes.io/projected/7ce9bce5-9c23-40ac-9683-6fb232e32c3c-kube-api-access-pl52v\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.675492 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ce9bce5-9c23-40ac-9683-6fb232e32c3c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.744423 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-797d44c9b-wrlx7" event={"ID":"e00fc104-ec73-4190-a598-86de7ca6cfa5","Type":"ContainerStarted","Data":"183964e4224d34ffefb68047495776b87e734441e831507181f1e3aafc498e51"} Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.752347 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-1df0-account-create-update-bg564" 
event={"ID":"7ce9bce5-9c23-40ac-9683-6fb232e32c3c","Type":"ContainerDied","Data":"2002bb74ed38eb2c1f0e486f850b4e290555f1f76e89842042231f9e9946ffde"} Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.752465 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2002bb74ed38eb2c1f0e486f850b4e290555f1f76e89842042231f9e9946ffde" Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.752538 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-1df0-account-create-update-bg564" Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.775193 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56bb55c768-vchmw" event={"ID":"47ef29f6-4627-4b84-968d-db9d7ed438da","Type":"ContainerStarted","Data":"99ac8673ac7f481cb7c33e55443030ac6b7a346a12cfd65d6c94f0f8452e36e3"} Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.788065 4764 generic.go:334] "Generic (PLEG): container finished" podID="349527e3-93e0-4342-845c-eb8775ab3e5a" containerID="ecd62f1b0eb5226209d3ecf6887c5a0f2d0feaa519397c875dd114ffc89d8b3b" exitCode=143 Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.788120 4764 generic.go:334] "Generic (PLEG): container finished" podID="349527e3-93e0-4342-845c-eb8775ab3e5a" containerID="30776e46d343f3c10f72753cb59ea338812b698d41259546e8c5f506b57f2e53" exitCode=143 Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.788255 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"349527e3-93e0-4342-845c-eb8775ab3e5a","Type":"ContainerDied","Data":"ecd62f1b0eb5226209d3ecf6887c5a0f2d0feaa519397c875dd114ffc89d8b3b"} Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.788297 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"349527e3-93e0-4342-845c-eb8775ab3e5a","Type":"ContainerDied","Data":"30776e46d343f3c10f72753cb59ea338812b698d41259546e8c5f506b57f2e53"} Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.788332 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"349527e3-93e0-4342-845c-eb8775ab3e5a","Type":"ContainerDied","Data":"6f8e9f0abe6ca5e7bff14c308c2db546c0b2c9b94e174f7fa80c547c31b16a26"} Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.788346 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f8e9f0abe6ca5e7bff14c308c2db546c0b2c9b94e174f7fa80c547c31b16a26" Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.791829 4764 generic.go:334] "Generic (PLEG): container finished" podID="49ee7179-02d8-4e07-9bb8-fce22456e804" containerID="1d3f3c09a65ee0d96c5761cf37afd174b0dd657fc73b2eb650e087f0fb7b89e4" exitCode=143 Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.791863 4764 generic.go:334] "Generic (PLEG): container finished" podID="49ee7179-02d8-4e07-9bb8-fce22456e804" containerID="07b51e4be4c9ba108c01cded8c1c80a772fa736483e292b6667b962df1606942" exitCode=143 Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.792086 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"49ee7179-02d8-4e07-9bb8-fce22456e804","Type":"ContainerDied","Data":"1d3f3c09a65ee0d96c5761cf37afd174b0dd657fc73b2eb650e087f0fb7b89e4"} Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.792184 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"49ee7179-02d8-4e07-9bb8-fce22456e804","Type":"ContainerDied","Data":"07b51e4be4c9ba108c01cded8c1c80a772fa736483e292b6667b962df1606942"} Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.830522 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.879386 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-combined-ca-bundle\") pod \"349527e3-93e0-4342-845c-eb8775ab3e5a\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.881771 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/349527e3-93e0-4342-845c-eb8775ab3e5a-logs\") pod \"349527e3-93e0-4342-845c-eb8775ab3e5a\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.881832 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxdzv\" (UniqueName: \"kubernetes.io/projected/349527e3-93e0-4342-845c-eb8775ab3e5a-kube-api-access-kxdzv\") pod \"349527e3-93e0-4342-845c-eb8775ab3e5a\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.881894 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/349527e3-93e0-4342-845c-eb8775ab3e5a-ceph\") pod \"349527e3-93e0-4342-845c-eb8775ab3e5a\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.882060 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-internal-tls-certs\") pod \"349527e3-93e0-4342-845c-eb8775ab3e5a\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.882162 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-config-data\") pod \"349527e3-93e0-4342-845c-eb8775ab3e5a\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.882233 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"349527e3-93e0-4342-845c-eb8775ab3e5a\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.882286 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/349527e3-93e0-4342-845c-eb8775ab3e5a-httpd-run\") pod \"349527e3-93e0-4342-845c-eb8775ab3e5a\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.882373 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-scripts\") pod \"349527e3-93e0-4342-845c-eb8775ab3e5a\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.887760 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/349527e3-93e0-4342-845c-eb8775ab3e5a-logs" (OuterVolumeSpecName: "logs") pod "349527e3-93e0-4342-845c-eb8775ab3e5a" (UID: "349527e3-93e0-4342-845c-eb8775ab3e5a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.895350 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-scripts" (OuterVolumeSpecName: "scripts") pod "349527e3-93e0-4342-845c-eb8775ab3e5a" (UID: "349527e3-93e0-4342-845c-eb8775ab3e5a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.895820 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/349527e3-93e0-4342-845c-eb8775ab3e5a-kube-api-access-kxdzv" (OuterVolumeSpecName: "kube-api-access-kxdzv") pod "349527e3-93e0-4342-845c-eb8775ab3e5a" (UID: "349527e3-93e0-4342-845c-eb8775ab3e5a"). InnerVolumeSpecName "kube-api-access-kxdzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.899154 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "349527e3-93e0-4342-845c-eb8775ab3e5a" (UID: "349527e3-93e0-4342-845c-eb8775ab3e5a"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.899984 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/349527e3-93e0-4342-845c-eb8775ab3e5a-ceph" (OuterVolumeSpecName: "ceph") pod "349527e3-93e0-4342-845c-eb8775ab3e5a" (UID: "349527e3-93e0-4342-845c-eb8775ab3e5a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.901004 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/349527e3-93e0-4342-845c-eb8775ab3e5a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "349527e3-93e0-4342-845c-eb8775ab3e5a" (UID: "349527e3-93e0-4342-845c-eb8775ab3e5a"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.930263 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "349527e3-93e0-4342-845c-eb8775ab3e5a" (UID: "349527e3-93e0-4342-845c-eb8775ab3e5a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.950828 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "349527e3-93e0-4342-845c-eb8775ab3e5a" (UID: "349527e3-93e0-4342-845c-eb8775ab3e5a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.994690 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/349527e3-93e0-4342-845c-eb8775ab3e5a-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.994733 4764 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.994763 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.994775 4764 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/349527e3-93e0-4342-845c-eb8775ab3e5a-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:48 
crc kubenswrapper[4764]: I0309 14:09:48.994785 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.994797 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.994820 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/349527e3-93e0-4342-845c-eb8775ab3e5a-logs\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.994834 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxdzv\" (UniqueName: \"kubernetes.io/projected/349527e3-93e0-4342-845c-eb8775ab3e5a-kube-api-access-kxdzv\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:49 crc kubenswrapper[4764]: I0309 14:09:49.022663 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-config-data" (OuterVolumeSpecName: "config-data") pod "349527e3-93e0-4342-845c-eb8775ab3e5a" (UID: "349527e3-93e0-4342-845c-eb8775ab3e5a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:09:49 crc kubenswrapper[4764]: I0309 14:09:49.025145 4764 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 09 14:09:49 crc kubenswrapper[4764]: I0309 14:09:49.096933 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:49 crc kubenswrapper[4764]: I0309 14:09:49.096968 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:49 crc kubenswrapper[4764]: I0309 14:09:49.808106 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 14:09:49 crc kubenswrapper[4764]: I0309 14:09:49.954393 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 14:09:49 crc kubenswrapper[4764]: I0309 14:09:49.981181 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.005580 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.020237 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 14:09:50 crc kubenswrapper[4764]: E0309 14:09:50.020924 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ee7179-02d8-4e07-9bb8-fce22456e804" containerName="glance-httpd" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.020954 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ee7179-02d8-4e07-9bb8-fce22456e804" containerName="glance-httpd" Mar 09 14:09:50 crc kubenswrapper[4764]: E0309 14:09:50.020982 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="349527e3-93e0-4342-845c-eb8775ab3e5a" containerName="glance-httpd" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.020991 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="349527e3-93e0-4342-845c-eb8775ab3e5a" containerName="glance-httpd" Mar 09 14:09:50 crc kubenswrapper[4764]: E0309 14:09:50.021019 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="349527e3-93e0-4342-845c-eb8775ab3e5a" containerName="glance-log" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.021028 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="349527e3-93e0-4342-845c-eb8775ab3e5a" containerName="glance-log" Mar 09 14:09:50 crc kubenswrapper[4764]: E0309 14:09:50.021039 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ee7179-02d8-4e07-9bb8-fce22456e804" containerName="glance-log" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.021047 4764 
state_mem.go:107] "Deleted CPUSet assignment" podUID="49ee7179-02d8-4e07-9bb8-fce22456e804" containerName="glance-log" Mar 09 14:09:50 crc kubenswrapper[4764]: E0309 14:09:50.021062 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ce9bce5-9c23-40ac-9683-6fb232e32c3c" containerName="mariadb-account-create-update" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.021071 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ce9bce5-9c23-40ac-9683-6fb232e32c3c" containerName="mariadb-account-create-update" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.021288 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ee7179-02d8-4e07-9bb8-fce22456e804" containerName="glance-log" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.021312 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="349527e3-93e0-4342-845c-eb8775ab3e5a" containerName="glance-log" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.021324 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ce9bce5-9c23-40ac-9683-6fb232e32c3c" containerName="mariadb-account-create-update" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.021338 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ee7179-02d8-4e07-9bb8-fce22456e804" containerName="glance-httpd" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.021359 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="349527e3-93e0-4342-845c-eb8775ab3e5a" containerName="glance-httpd" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.022735 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.026226 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.026337 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.047857 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/49ee7179-02d8-4e07-9bb8-fce22456e804-httpd-run\") pod \"49ee7179-02d8-4e07-9bb8-fce22456e804\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.048117 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-config-data\") pod \"49ee7179-02d8-4e07-9bb8-fce22456e804\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.048175 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-public-tls-certs\") pod \"49ee7179-02d8-4e07-9bb8-fce22456e804\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.048228 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"49ee7179-02d8-4e07-9bb8-fce22456e804\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.049243 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/49ee7179-02d8-4e07-9bb8-fce22456e804-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "49ee7179-02d8-4e07-9bb8-fce22456e804" (UID: "49ee7179-02d8-4e07-9bb8-fce22456e804"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.049718 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49ee7179-02d8-4e07-9bb8-fce22456e804-logs\") pod \"49ee7179-02d8-4e07-9bb8-fce22456e804\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.049791 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t465\" (UniqueName: \"kubernetes.io/projected/49ee7179-02d8-4e07-9bb8-fce22456e804-kube-api-access-4t465\") pod \"49ee7179-02d8-4e07-9bb8-fce22456e804\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.049825 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-scripts\") pod \"49ee7179-02d8-4e07-9bb8-fce22456e804\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.049890 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/49ee7179-02d8-4e07-9bb8-fce22456e804-ceph\") pod \"49ee7179-02d8-4e07-9bb8-fce22456e804\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.050051 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-combined-ca-bundle\") pod \"49ee7179-02d8-4e07-9bb8-fce22456e804\" (UID: 
\"49ee7179-02d8-4e07-9bb8-fce22456e804\") " Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.050189 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49ee7179-02d8-4e07-9bb8-fce22456e804-logs" (OuterVolumeSpecName: "logs") pod "49ee7179-02d8-4e07-9bb8-fce22456e804" (UID: "49ee7179-02d8-4e07-9bb8-fce22456e804"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.051239 4764 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/49ee7179-02d8-4e07-9bb8-fce22456e804-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.051261 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49ee7179-02d8-4e07-9bb8-fce22456e804-logs\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.058321 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-scripts" (OuterVolumeSpecName: "scripts") pod "49ee7179-02d8-4e07-9bb8-fce22456e804" (UID: "49ee7179-02d8-4e07-9bb8-fce22456e804"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.063753 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.069232 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ee7179-02d8-4e07-9bb8-fce22456e804-kube-api-access-4t465" (OuterVolumeSpecName: "kube-api-access-4t465") pod "49ee7179-02d8-4e07-9bb8-fce22456e804" (UID: "49ee7179-02d8-4e07-9bb8-fce22456e804"). InnerVolumeSpecName "kube-api-access-4t465". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.071243 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ee7179-02d8-4e07-9bb8-fce22456e804-ceph" (OuterVolumeSpecName: "ceph") pod "49ee7179-02d8-4e07-9bb8-fce22456e804" (UID: "49ee7179-02d8-4e07-9bb8-fce22456e804"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.076992 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "49ee7179-02d8-4e07-9bb8-fce22456e804" (UID: "49ee7179-02d8-4e07-9bb8-fce22456e804"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.118620 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "49ee7179-02d8-4e07-9bb8-fce22456e804" (UID: "49ee7179-02d8-4e07-9bb8-fce22456e804"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.123997 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49ee7179-02d8-4e07-9bb8-fce22456e804" (UID: "49ee7179-02d8-4e07-9bb8-fce22456e804"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.137884 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-config-data" (OuterVolumeSpecName: "config-data") pod "49ee7179-02d8-4e07-9bb8-fce22456e804" (UID: "49ee7179-02d8-4e07-9bb8-fce22456e804"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.152952 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-logs\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.153049 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-scripts\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.153159 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.153185 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.153217 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-config-data\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.153244 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgvwl\" (UniqueName: \"kubernetes.io/projected/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-kube-api-access-pgvwl\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.153289 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-ceph\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.153326 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.153348 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.153419 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.153434 4764 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.153460 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.153473 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t465\" (UniqueName: \"kubernetes.io/projected/49ee7179-02d8-4e07-9bb8-fce22456e804-kube-api-access-4t465\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.153483 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.153493 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/49ee7179-02d8-4e07-9bb8-fce22456e804-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.153502 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:50 crc 
kubenswrapper[4764]: I0309 14:09:50.178853 4764 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.256078 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.256129 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.256164 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-config-data\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.256185 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgvwl\" (UniqueName: \"kubernetes.io/projected/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-kube-api-access-pgvwl\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.256736 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-ceph\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.256799 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.256826 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.256959 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-logs\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.257030 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-scripts\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.257094 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:50 crc kubenswrapper[4764]: 
I0309 14:09:50.258117 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.259738 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-logs\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.260252 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.266000 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.266183 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-ceph\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.267585 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-config-data\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.267625 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-scripts\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.270536 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.286767 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgvwl\" (UniqueName: \"kubernetes.io/projected/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-kube-api-access-pgvwl\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.333158 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.350163 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.823657 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"49ee7179-02d8-4e07-9bb8-fce22456e804","Type":"ContainerDied","Data":"1f84cee9a80ba50e44abce0cb213778729f5e3cb3d6103fdf6b0ebee436d5285"} Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.824350 4764 scope.go:117] "RemoveContainer" containerID="1d3f3c09a65ee0d96c5761cf37afd174b0dd657fc73b2eb650e087f0fb7b89e4" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.824528 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.879439 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.903560 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.914288 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.916619 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.920009 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.921184 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.930829 4764 scope.go:117] "RemoveContainer" containerID="07b51e4be4c9ba108c01cded8c1c80a772fa736483e292b6667b962df1606942" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.940075 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.982688 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz2wk\" (UniqueName: \"kubernetes.io/projected/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-kube-api-access-xz2wk\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.982891 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-logs\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.982925 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-ceph\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:50 crc 
kubenswrapper[4764]: I0309 14:09:50.982975 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-config-data\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.983015 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.983088 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.983262 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.983321 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-scripts\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:50 crc 
kubenswrapper[4764]: I0309 14:09:50.983403 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.096266 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.096345 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.096527 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.096568 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-scripts\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.096659 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.096785 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz2wk\" (UniqueName: \"kubernetes.io/projected/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-kube-api-access-xz2wk\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.096992 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-logs\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.097025 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-ceph\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.097047 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-config-data\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.098827 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-logs\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.099139 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.099670 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.102154 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.106638 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.109108 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.122534 
4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-ceph\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.127004 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-scripts\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.132432 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-config-data\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.132949 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz2wk\" (UniqueName: \"kubernetes.io/projected/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-kube-api-access-xz2wk\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.168029 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.261445 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.346679 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.592706 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="349527e3-93e0-4342-845c-eb8775ab3e5a" path="/var/lib/kubelet/pods/349527e3-93e0-4342-845c-eb8775ab3e5a/volumes" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.594714 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ee7179-02d8-4e07-9bb8-fce22456e804" path="/var/lib/kubelet/pods/49ee7179-02d8-4e07-9bb8-fce22456e804/volumes" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.869765 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"66d58a1b-5d94-4d28-bcb3-0b20f0516eab","Type":"ContainerStarted","Data":"04f4bc8a503c5f3a059e9920c27317855087791ac437284180c815115a1d77ab"} Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.933660 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:52 crc kubenswrapper[4764]: I0309 14:09:52.101930 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 14:09:52 crc kubenswrapper[4764]: I0309 14:09:52.591860 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Mar 09 14:09:52 crc kubenswrapper[4764]: I0309 14:09:52.892733 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"66d58a1b-5d94-4d28-bcb3-0b20f0516eab","Type":"ContainerStarted","Data":"5efde14e4961a0c01b0d30dad46664f7ec8bdb3a095cf8f867dc1096e67930fe"} Mar 09 14:09:52 crc kubenswrapper[4764]: I0309 14:09:52.898012 4764 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"22563404-fb5a-4d95-bae1-dd24d6fcc8d1","Type":"ContainerStarted","Data":"945ab8a9c2f227bba0dd20eac0acb19c987f60d231f148f4ac3cb22c038524f8"} Mar 09 14:09:52 crc kubenswrapper[4764]: I0309 14:09:52.919496 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.649799 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-j6s45"] Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.651726 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-j6s45" Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.655917 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-692nx" Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.677199 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-j6s45"] Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.677510 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.695424 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-job-config-data\") pod \"manila-db-sync-j6s45\" (UID: \"6711cdff-410c-4d91-b172-c2065054c1be\") " pod="openstack/manila-db-sync-j6s45" Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.695764 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svz5p\" (UniqueName: \"kubernetes.io/projected/6711cdff-410c-4d91-b172-c2065054c1be-kube-api-access-svz5p\") pod \"manila-db-sync-j6s45\" (UID: \"6711cdff-410c-4d91-b172-c2065054c1be\") " 
pod="openstack/manila-db-sync-j6s45" Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.696142 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-config-data\") pod \"manila-db-sync-j6s45\" (UID: \"6711cdff-410c-4d91-b172-c2065054c1be\") " pod="openstack/manila-db-sync-j6s45" Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.696472 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-combined-ca-bundle\") pod \"manila-db-sync-j6s45\" (UID: \"6711cdff-410c-4d91-b172-c2065054c1be\") " pod="openstack/manila-db-sync-j6s45" Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.799136 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-config-data\") pod \"manila-db-sync-j6s45\" (UID: \"6711cdff-410c-4d91-b172-c2065054c1be\") " pod="openstack/manila-db-sync-j6s45" Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.800501 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-combined-ca-bundle\") pod \"manila-db-sync-j6s45\" (UID: \"6711cdff-410c-4d91-b172-c2065054c1be\") " pod="openstack/manila-db-sync-j6s45" Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.800617 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-job-config-data\") pod \"manila-db-sync-j6s45\" (UID: \"6711cdff-410c-4d91-b172-c2065054c1be\") " pod="openstack/manila-db-sync-j6s45" Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.800702 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svz5p\" (UniqueName: \"kubernetes.io/projected/6711cdff-410c-4d91-b172-c2065054c1be-kube-api-access-svz5p\") pod \"manila-db-sync-j6s45\" (UID: \"6711cdff-410c-4d91-b172-c2065054c1be\") " pod="openstack/manila-db-sync-j6s45" Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.812155 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-combined-ca-bundle\") pod \"manila-db-sync-j6s45\" (UID: \"6711cdff-410c-4d91-b172-c2065054c1be\") " pod="openstack/manila-db-sync-j6s45" Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.812297 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-config-data\") pod \"manila-db-sync-j6s45\" (UID: \"6711cdff-410c-4d91-b172-c2065054c1be\") " pod="openstack/manila-db-sync-j6s45" Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.826479 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svz5p\" (UniqueName: \"kubernetes.io/projected/6711cdff-410c-4d91-b172-c2065054c1be-kube-api-access-svz5p\") pod \"manila-db-sync-j6s45\" (UID: \"6711cdff-410c-4d91-b172-c2065054c1be\") " pod="openstack/manila-db-sync-j6s45" Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.830188 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-job-config-data\") pod \"manila-db-sync-j6s45\" (UID: \"6711cdff-410c-4d91-b172-c2065054c1be\") " pod="openstack/manila-db-sync-j6s45" Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.920307 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"22563404-fb5a-4d95-bae1-dd24d6fcc8d1","Type":"ContainerStarted","Data":"1a1e2836425112167dc3a353e0bfab1f8e138d9c7e0337213050fc6ff682af19"} Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.924355 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"66d58a1b-5d94-4d28-bcb3-0b20f0516eab","Type":"ContainerStarted","Data":"bd4eb58d364d43b43f36b9f6d2633e35d7a9461a68f5aaca98bd22b9d851df9a"} Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.962391 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.962361898 podStartE2EDuration="4.962361898s" podCreationTimestamp="2026-03-09 14:09:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:09:53.950526402 +0000 UTC m=+2949.200698320" watchObservedRunningTime="2026-03-09 14:09:53.962361898 +0000 UTC m=+2949.212533816" Mar 09 14:09:54 crc kubenswrapper[4764]: I0309 14:09:54.006517 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-j6s45" Mar 09 14:09:58 crc kubenswrapper[4764]: I0309 14:09:58.370800 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:09:58 crc kubenswrapper[4764]: I0309 14:09:58.371718 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:09:59 crc kubenswrapper[4764]: I0309 14:09:59.980757 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-j6s45"] Mar 09 14:09:59 crc kubenswrapper[4764]: W0309 14:09:59.998718 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6711cdff_410c_4d91_b172_c2065054c1be.slice/crio-feee9d84b5899b620a2823aa6ab377cd98db0a958fd8bd41a85078ce9d01392c WatchSource:0}: Error finding container feee9d84b5899b620a2823aa6ab377cd98db0a958fd8bd41a85078ce9d01392c: Status 404 returned error can't find the container with id feee9d84b5899b620a2823aa6ab377cd98db0a958fd8bd41a85078ce9d01392c Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.016722 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55f8b7fc4c-6rdd7" event={"ID":"6b944cf9-8278-4b16-b09c-0da6a2519b2a","Type":"ContainerStarted","Data":"1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1"} Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.018938 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56bb55c768-vchmw" 
event={"ID":"47ef29f6-4627-4b84-968d-db9d7ed438da","Type":"ContainerStarted","Data":"2cde99c478c4a636d0754a9a4064c4e5d79e6ed799daf627481c02df995be0c3"} Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.023541 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-797d44c9b-wrlx7" event={"ID":"e00fc104-ec73-4190-a598-86de7ca6cfa5","Type":"ContainerStarted","Data":"ed7bd666dbaa56d7a77980118a8739a44cafc9a966ea8469e78e592a4a49ac96"} Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.026267 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9d94c4c7-px8jh" event={"ID":"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243","Type":"ContainerStarted","Data":"fabc9d27abd14d14db86a85f4413de77c2fd101b8722a207f502ad9ee60b4405"} Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.152279 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551090-8wdlp"] Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.154136 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551090-8wdlp" Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.160084 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.161592 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.161774 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.164365 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551090-8wdlp"] Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.302954 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqk4g\" (UniqueName: \"kubernetes.io/projected/4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e-kube-api-access-nqk4g\") pod \"auto-csr-approver-29551090-8wdlp\" (UID: \"4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e\") " pod="openshift-infra/auto-csr-approver-29551090-8wdlp" Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.351248 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.353699 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.386807 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.398919 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 09 14:10:00 crc kubenswrapper[4764]: 
I0309 14:10:00.405990 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqk4g\" (UniqueName: \"kubernetes.io/projected/4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e-kube-api-access-nqk4g\") pod \"auto-csr-approver-29551090-8wdlp\" (UID: \"4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e\") " pod="openshift-infra/auto-csr-approver-29551090-8wdlp" Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.428388 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqk4g\" (UniqueName: \"kubernetes.io/projected/4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e-kube-api-access-nqk4g\") pod \"auto-csr-approver-29551090-8wdlp\" (UID: \"4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e\") " pod="openshift-infra/auto-csr-approver-29551090-8wdlp" Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.474749 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551090-8wdlp" Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.820764 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551090-8wdlp"] Mar 09 14:10:00 crc kubenswrapper[4764]: W0309 14:10:00.822794 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b56c1fe_b4de_45ba_8ca2_7bae98a2e97e.slice/crio-67b8f22e1dd658bc6fc7c70355b7dbf66bd2db713618122623ec70d154f78240 WatchSource:0}: Error finding container 67b8f22e1dd658bc6fc7c70355b7dbf66bd2db713618122623ec70d154f78240: Status 404 returned error can't find the container with id 67b8f22e1dd658bc6fc7c70355b7dbf66bd2db713618122623ec70d154f78240 Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.043144 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-j6s45" 
event={"ID":"6711cdff-410c-4d91-b172-c2065054c1be","Type":"ContainerStarted","Data":"feee9d84b5899b620a2823aa6ab377cd98db0a958fd8bd41a85078ce9d01392c"} Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.049330 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55f8b7fc4c-6rdd7" event={"ID":"6b944cf9-8278-4b16-b09c-0da6a2519b2a","Type":"ContainerStarted","Data":"b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd"} Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.049687 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-55f8b7fc4c-6rdd7" podUID="6b944cf9-8278-4b16-b09c-0da6a2519b2a" containerName="horizon" containerID="cri-o://b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd" gracePeriod=30 Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.049627 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-55f8b7fc4c-6rdd7" podUID="6b944cf9-8278-4b16-b09c-0da6a2519b2a" containerName="horizon-log" containerID="cri-o://1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1" gracePeriod=30 Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.064952 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56bb55c768-vchmw" event={"ID":"47ef29f6-4627-4b84-968d-db9d7ed438da","Type":"ContainerStarted","Data":"aaff32cc6e4babd2bdfeafec5fd5821c681fba782160c35443c5ff42d44cf549"} Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.067314 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551090-8wdlp" event={"ID":"4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e","Type":"ContainerStarted","Data":"67b8f22e1dd658bc6fc7c70355b7dbf66bd2db713618122623ec70d154f78240"} Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.069833 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"22563404-fb5a-4d95-bae1-dd24d6fcc8d1","Type":"ContainerStarted","Data":"42d1d334ebc93133ced56635a8f1065b36cde7d3488daab4ec92dcc8b0f826f6"} Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.073447 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-797d44c9b-wrlx7" event={"ID":"e00fc104-ec73-4190-a598-86de7ca6cfa5","Type":"ContainerStarted","Data":"c791eb4afad3d3b7319a4ecbb9ad7314d04c8eb6a4954765b916e231a57d04c0"} Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.082352 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-9d94c4c7-px8jh" podUID="849dc0e6-d7a3-4745-8cc0-7b7af3e4d243" containerName="horizon-log" containerID="cri-o://fabc9d27abd14d14db86a85f4413de77c2fd101b8722a207f502ad9ee60b4405" gracePeriod=30 Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.082785 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9d94c4c7-px8jh" event={"ID":"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243","Type":"ContainerStarted","Data":"c5d8efe5920ca730c02900ccbfc9cdc3905256778569090d306fce8d9fecf051"} Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.082832 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.083392 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-9d94c4c7-px8jh" podUID="849dc0e6-d7a3-4745-8cc0-7b7af3e4d243" containerName="horizon" containerID="cri-o://c5d8efe5920ca730c02900ccbfc9cdc3905256778569090d306fce8d9fecf051" gracePeriod=30 Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.083559 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.085044 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/horizon-55f8b7fc4c-6rdd7" podStartSLOduration=3.460068893 podStartE2EDuration="18.085029656s" podCreationTimestamp="2026-03-09 14:09:43 +0000 UTC" firstStartedPulling="2026-03-09 14:09:44.817417983 +0000 UTC m=+2940.067589891" lastFinishedPulling="2026-03-09 14:09:59.442378746 +0000 UTC m=+2954.692550654" observedRunningTime="2026-03-09 14:10:01.082889519 +0000 UTC m=+2956.333061447" watchObservedRunningTime="2026-03-09 14:10:01.085029656 +0000 UTC m=+2956.335201554" Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.140247 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=11.14022114 podStartE2EDuration="11.14022114s" podCreationTimestamp="2026-03-09 14:09:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:10:01.128278311 +0000 UTC m=+2956.378450219" watchObservedRunningTime="2026-03-09 14:10:01.14022114 +0000 UTC m=+2956.390393048" Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.175942 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-56bb55c768-vchmw" podStartSLOduration=4.30296083 podStartE2EDuration="15.175910583s" podCreationTimestamp="2026-03-09 14:09:46 +0000 UTC" firstStartedPulling="2026-03-09 14:09:48.671023826 +0000 UTC m=+2943.921195914" lastFinishedPulling="2026-03-09 14:09:59.543973749 +0000 UTC m=+2954.794145667" observedRunningTime="2026-03-09 14:10:01.157256604 +0000 UTC m=+2956.407428512" watchObservedRunningTime="2026-03-09 14:10:01.175910583 +0000 UTC m=+2956.426082491" Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.201368 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-9d94c4c7-px8jh" podStartSLOduration=3.584317391 podStartE2EDuration="18.201329481s" podCreationTimestamp="2026-03-09 14:09:43 +0000 UTC" 
firstStartedPulling="2026-03-09 14:09:44.825236682 +0000 UTC m=+2940.075408590" lastFinishedPulling="2026-03-09 14:09:59.442248782 +0000 UTC m=+2954.692420680" observedRunningTime="2026-03-09 14:10:01.188136729 +0000 UTC m=+2956.438308657" watchObservedRunningTime="2026-03-09 14:10:01.201329481 +0000 UTC m=+2956.451501409" Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.218424 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-797d44c9b-wrlx7" podStartSLOduration=4.353830188 podStartE2EDuration="15.218392137s" podCreationTimestamp="2026-03-09 14:09:46 +0000 UTC" firstStartedPulling="2026-03-09 14:09:48.680010096 +0000 UTC m=+2943.930182004" lastFinishedPulling="2026-03-09 14:09:59.544572045 +0000 UTC m=+2954.794743953" observedRunningTime="2026-03-09 14:10:01.216341782 +0000 UTC m=+2956.466513700" watchObservedRunningTime="2026-03-09 14:10:01.218392137 +0000 UTC m=+2956.468564065" Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.262477 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.262534 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.306110 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.317590 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 09 14:10:02 crc kubenswrapper[4764]: I0309 14:10:02.089917 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 09 14:10:02 crc kubenswrapper[4764]: I0309 14:10:02.089987 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-external-api-0" Mar 09 14:10:03 crc kubenswrapper[4764]: I0309 14:10:03.123720 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551090-8wdlp" event={"ID":"4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e","Type":"ContainerStarted","Data":"c50fe2a841af25f201b2df55f17fb4354fb702de4d79adbf221b5ea06463110b"} Mar 09 14:10:03 crc kubenswrapper[4764]: I0309 14:10:03.160800 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551090-8wdlp" podStartSLOduration=1.9157215650000001 podStartE2EDuration="3.160771329s" podCreationTimestamp="2026-03-09 14:10:00 +0000 UTC" firstStartedPulling="2026-03-09 14:10:00.826622196 +0000 UTC m=+2956.076794104" lastFinishedPulling="2026-03-09 14:10:02.07167196 +0000 UTC m=+2957.321843868" observedRunningTime="2026-03-09 14:10:03.152836737 +0000 UTC m=+2958.403008645" watchObservedRunningTime="2026-03-09 14:10:03.160771329 +0000 UTC m=+2958.410943257" Mar 09 14:10:03 crc kubenswrapper[4764]: I0309 14:10:03.528792 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:10:03 crc kubenswrapper[4764]: I0309 14:10:03.834162 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:10:04 crc kubenswrapper[4764]: I0309 14:10:04.135928 4764 generic.go:334] "Generic (PLEG): container finished" podID="4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e" containerID="c50fe2a841af25f201b2df55f17fb4354fb702de4d79adbf221b5ea06463110b" exitCode=0 Mar 09 14:10:04 crc kubenswrapper[4764]: I0309 14:10:04.135986 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551090-8wdlp" event={"ID":"4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e","Type":"ContainerDied","Data":"c50fe2a841af25f201b2df55f17fb4354fb702de4d79adbf221b5ea06463110b"} Mar 09 14:10:05 crc kubenswrapper[4764]: I0309 14:10:05.321865 
4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 09 14:10:05 crc kubenswrapper[4764]: I0309 14:10:05.345683 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 09 14:10:05 crc kubenswrapper[4764]: I0309 14:10:05.346371 4764 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 14:10:05 crc kubenswrapper[4764]: I0309 14:10:05.494954 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 09 14:10:07 crc kubenswrapper[4764]: I0309 14:10:07.276773 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:10:07 crc kubenswrapper[4764]: I0309 14:10:07.277316 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:10:07 crc kubenswrapper[4764]: I0309 14:10:07.580767 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551090-8wdlp" Mar 09 14:10:07 crc kubenswrapper[4764]: I0309 14:10:07.588445 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:10:07 crc kubenswrapper[4764]: I0309 14:10:07.588491 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:10:07 crc kubenswrapper[4764]: I0309 14:10:07.711479 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 09 14:10:07 crc kubenswrapper[4764]: I0309 14:10:07.722163 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqk4g\" (UniqueName: \"kubernetes.io/projected/4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e-kube-api-access-nqk4g\") pod \"4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e\" (UID: \"4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e\") " Mar 09 14:10:07 crc kubenswrapper[4764]: I0309 14:10:07.743127 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e-kube-api-access-nqk4g" (OuterVolumeSpecName: "kube-api-access-nqk4g") pod "4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e" (UID: "4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e"). InnerVolumeSpecName "kube-api-access-nqk4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:10:07 crc kubenswrapper[4764]: I0309 14:10:07.827387 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqk4g\" (UniqueName: \"kubernetes.io/projected/4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e-kube-api-access-nqk4g\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:08 crc kubenswrapper[4764]: I0309 14:10:08.199807 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551090-8wdlp" Mar 09 14:10:08 crc kubenswrapper[4764]: I0309 14:10:08.216718 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551090-8wdlp" event={"ID":"4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e","Type":"ContainerDied","Data":"67b8f22e1dd658bc6fc7c70355b7dbf66bd2db713618122623ec70d154f78240"} Mar 09 14:10:08 crc kubenswrapper[4764]: I0309 14:10:08.216810 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67b8f22e1dd658bc6fc7c70355b7dbf66bd2db713618122623ec70d154f78240" Mar 09 14:10:08 crc kubenswrapper[4764]: I0309 14:10:08.677665 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551084-pchq7"] Mar 09 14:10:08 crc kubenswrapper[4764]: I0309 14:10:08.689173 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551084-pchq7"] Mar 09 14:10:09 crc kubenswrapper[4764]: I0309 14:10:09.572583 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91d1d723-d4e8-40d8-9d17-3dfee51e7aef" path="/var/lib/kubelet/pods/91d1d723-d4e8-40d8-9d17-3dfee51e7aef/volumes" Mar 09 14:10:09 crc kubenswrapper[4764]: I0309 14:10:09.776083 4764 scope.go:117] "RemoveContainer" containerID="6e6b381c4b8297d66803da8d662836720642fff63afbcf8243cdcd4157213d11" Mar 09 14:10:10 crc kubenswrapper[4764]: I0309 14:10:10.221590 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-j6s45" event={"ID":"6711cdff-410c-4d91-b172-c2065054c1be","Type":"ContainerStarted","Data":"8d377192006e66551cb0ea6108fb1f8c6b81bac19c99e99d5e21dffabe9e931d"} Mar 09 14:10:10 crc kubenswrapper[4764]: I0309 14:10:10.248615 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-j6s45" podStartSLOduration=8.311472832 podStartE2EDuration="17.248585037s" podCreationTimestamp="2026-03-09 14:09:53 +0000 UTC" 
firstStartedPulling="2026-03-09 14:10:00.002203904 +0000 UTC m=+2955.252375812" lastFinishedPulling="2026-03-09 14:10:08.939316119 +0000 UTC m=+2964.189488017" observedRunningTime="2026-03-09 14:10:10.237229233 +0000 UTC m=+2965.487401151" watchObservedRunningTime="2026-03-09 14:10:10.248585037 +0000 UTC m=+2965.498756945" Mar 09 14:10:13 crc kubenswrapper[4764]: I0309 14:10:13.079711 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vks4j"] Mar 09 14:10:13 crc kubenswrapper[4764]: E0309 14:10:13.080900 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e" containerName="oc" Mar 09 14:10:13 crc kubenswrapper[4764]: I0309 14:10:13.080920 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e" containerName="oc" Mar 09 14:10:13 crc kubenswrapper[4764]: I0309 14:10:13.081142 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e" containerName="oc" Mar 09 14:10:13 crc kubenswrapper[4764]: I0309 14:10:13.085535 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:13 crc kubenswrapper[4764]: I0309 14:10:13.124625 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vks4j"] Mar 09 14:10:13 crc kubenswrapper[4764]: I0309 14:10:13.174044 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9c5571a-71f1-42d9-8025-2f51e13a5f03-catalog-content\") pod \"redhat-marketplace-vks4j\" (UID: \"f9c5571a-71f1-42d9-8025-2f51e13a5f03\") " pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:13 crc kubenswrapper[4764]: I0309 14:10:13.174088 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9c5571a-71f1-42d9-8025-2f51e13a5f03-utilities\") pod \"redhat-marketplace-vks4j\" (UID: \"f9c5571a-71f1-42d9-8025-2f51e13a5f03\") " pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:13 crc kubenswrapper[4764]: I0309 14:10:13.174127 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52qrw\" (UniqueName: \"kubernetes.io/projected/f9c5571a-71f1-42d9-8025-2f51e13a5f03-kube-api-access-52qrw\") pod \"redhat-marketplace-vks4j\" (UID: \"f9c5571a-71f1-42d9-8025-2f51e13a5f03\") " pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:13 crc kubenswrapper[4764]: I0309 14:10:13.276882 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9c5571a-71f1-42d9-8025-2f51e13a5f03-catalog-content\") pod \"redhat-marketplace-vks4j\" (UID: \"f9c5571a-71f1-42d9-8025-2f51e13a5f03\") " pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:13 crc kubenswrapper[4764]: I0309 14:10:13.276951 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9c5571a-71f1-42d9-8025-2f51e13a5f03-utilities\") pod \"redhat-marketplace-vks4j\" (UID: \"f9c5571a-71f1-42d9-8025-2f51e13a5f03\") " pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:13 crc kubenswrapper[4764]: I0309 14:10:13.277004 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52qrw\" (UniqueName: \"kubernetes.io/projected/f9c5571a-71f1-42d9-8025-2f51e13a5f03-kube-api-access-52qrw\") pod \"redhat-marketplace-vks4j\" (UID: \"f9c5571a-71f1-42d9-8025-2f51e13a5f03\") " pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:13 crc kubenswrapper[4764]: I0309 14:10:13.277559 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9c5571a-71f1-42d9-8025-2f51e13a5f03-catalog-content\") pod \"redhat-marketplace-vks4j\" (UID: \"f9c5571a-71f1-42d9-8025-2f51e13a5f03\") " pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:13 crc kubenswrapper[4764]: I0309 14:10:13.277584 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9c5571a-71f1-42d9-8025-2f51e13a5f03-utilities\") pod \"redhat-marketplace-vks4j\" (UID: \"f9c5571a-71f1-42d9-8025-2f51e13a5f03\") " pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:13 crc kubenswrapper[4764]: I0309 14:10:13.310097 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52qrw\" (UniqueName: \"kubernetes.io/projected/f9c5571a-71f1-42d9-8025-2f51e13a5f03-kube-api-access-52qrw\") pod \"redhat-marketplace-vks4j\" (UID: \"f9c5571a-71f1-42d9-8025-2f51e13a5f03\") " pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:13 crc kubenswrapper[4764]: I0309 14:10:13.425617 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:14 crc kubenswrapper[4764]: I0309 14:10:14.016134 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vks4j"] Mar 09 14:10:14 crc kubenswrapper[4764]: I0309 14:10:14.273857 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vks4j" event={"ID":"f9c5571a-71f1-42d9-8025-2f51e13a5f03","Type":"ContainerStarted","Data":"c4a8e2df77908e08ac398134c388212868113a4e069dbf9e9ca754666552eee8"} Mar 09 14:10:14 crc kubenswrapper[4764]: I0309 14:10:14.273909 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vks4j" event={"ID":"f9c5571a-71f1-42d9-8025-2f51e13a5f03","Type":"ContainerStarted","Data":"3870660a56f929c820c0d702e6958704a17b57192fa2a52ac26a15baa43c7554"} Mar 09 14:10:15 crc kubenswrapper[4764]: I0309 14:10:15.285218 4764 generic.go:334] "Generic (PLEG): container finished" podID="f9c5571a-71f1-42d9-8025-2f51e13a5f03" containerID="c4a8e2df77908e08ac398134c388212868113a4e069dbf9e9ca754666552eee8" exitCode=0 Mar 09 14:10:15 crc kubenswrapper[4764]: I0309 14:10:15.285299 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vks4j" event={"ID":"f9c5571a-71f1-42d9-8025-2f51e13a5f03","Type":"ContainerDied","Data":"c4a8e2df77908e08ac398134c388212868113a4e069dbf9e9ca754666552eee8"} Mar 09 14:10:15 crc kubenswrapper[4764]: I0309 14:10:15.286039 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vks4j" event={"ID":"f9c5571a-71f1-42d9-8025-2f51e13a5f03","Type":"ContainerStarted","Data":"5810019c18336df6ff05082f8f948492687897d783cdf12ed4c00b5fb1717462"} Mar 09 14:10:16 crc kubenswrapper[4764]: I0309 14:10:16.299269 4764 generic.go:334] "Generic (PLEG): container finished" podID="f9c5571a-71f1-42d9-8025-2f51e13a5f03" 
containerID="5810019c18336df6ff05082f8f948492687897d783cdf12ed4c00b5fb1717462" exitCode=0 Mar 09 14:10:16 crc kubenswrapper[4764]: I0309 14:10:16.299361 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vks4j" event={"ID":"f9c5571a-71f1-42d9-8025-2f51e13a5f03","Type":"ContainerDied","Data":"5810019c18336df6ff05082f8f948492687897d783cdf12ed4c00b5fb1717462"} Mar 09 14:10:17 crc kubenswrapper[4764]: I0309 14:10:17.283979 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-797d44c9b-wrlx7" podUID="e00fc104-ec73-4190-a598-86de7ca6cfa5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.5:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.5:8443: connect: connection refused" Mar 09 14:10:17 crc kubenswrapper[4764]: I0309 14:10:17.347851 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vks4j" event={"ID":"f9c5571a-71f1-42d9-8025-2f51e13a5f03","Type":"ContainerStarted","Data":"fa0246eba6b220102d13a010f48f2dada3c5eb73ad9e6ea15c1975fe88b59d3b"} Mar 09 14:10:17 crc kubenswrapper[4764]: I0309 14:10:17.393864 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vks4j" podStartSLOduration=1.927187277 podStartE2EDuration="4.393838498s" podCreationTimestamp="2026-03-09 14:10:13 +0000 UTC" firstStartedPulling="2026-03-09 14:10:14.277155161 +0000 UTC m=+2969.527327069" lastFinishedPulling="2026-03-09 14:10:16.743806382 +0000 UTC m=+2971.993978290" observedRunningTime="2026-03-09 14:10:17.385081384 +0000 UTC m=+2972.635253332" watchObservedRunningTime="2026-03-09 14:10:17.393838498 +0000 UTC m=+2972.644010406" Mar 09 14:10:17 crc kubenswrapper[4764]: I0309 14:10:17.595040 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-56bb55c768-vchmw" podUID="47ef29f6-4627-4b84-968d-db9d7ed438da" containerName="horizon" 
probeResult="failure" output="Get \"https://10.217.1.6:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.6:8443: connect: connection refused" Mar 09 14:10:21 crc kubenswrapper[4764]: I0309 14:10:21.392385 4764 generic.go:334] "Generic (PLEG): container finished" podID="6711cdff-410c-4d91-b172-c2065054c1be" containerID="8d377192006e66551cb0ea6108fb1f8c6b81bac19c99e99d5e21dffabe9e931d" exitCode=0 Mar 09 14:10:21 crc kubenswrapper[4764]: I0309 14:10:21.392466 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-j6s45" event={"ID":"6711cdff-410c-4d91-b172-c2065054c1be","Type":"ContainerDied","Data":"8d377192006e66551cb0ea6108fb1f8c6b81bac19c99e99d5e21dffabe9e931d"} Mar 09 14:10:22 crc kubenswrapper[4764]: I0309 14:10:22.904142 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-j6s45" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.060782 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-config-data\") pod \"6711cdff-410c-4d91-b172-c2065054c1be\" (UID: \"6711cdff-410c-4d91-b172-c2065054c1be\") " Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.060961 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svz5p\" (UniqueName: \"kubernetes.io/projected/6711cdff-410c-4d91-b172-c2065054c1be-kube-api-access-svz5p\") pod \"6711cdff-410c-4d91-b172-c2065054c1be\" (UID: \"6711cdff-410c-4d91-b172-c2065054c1be\") " Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.061022 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-job-config-data\") pod \"6711cdff-410c-4d91-b172-c2065054c1be\" (UID: \"6711cdff-410c-4d91-b172-c2065054c1be\") " Mar 09 14:10:23 crc 
kubenswrapper[4764]: I0309 14:10:23.061161 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-combined-ca-bundle\") pod \"6711cdff-410c-4d91-b172-c2065054c1be\" (UID: \"6711cdff-410c-4d91-b172-c2065054c1be\") " Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.090179 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "6711cdff-410c-4d91-b172-c2065054c1be" (UID: "6711cdff-410c-4d91-b172-c2065054c1be"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.090323 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6711cdff-410c-4d91-b172-c2065054c1be-kube-api-access-svz5p" (OuterVolumeSpecName: "kube-api-access-svz5p") pod "6711cdff-410c-4d91-b172-c2065054c1be" (UID: "6711cdff-410c-4d91-b172-c2065054c1be"). InnerVolumeSpecName "kube-api-access-svz5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.103522 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-config-data" (OuterVolumeSpecName: "config-data") pod "6711cdff-410c-4d91-b172-c2065054c1be" (UID: "6711cdff-410c-4d91-b172-c2065054c1be"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.110145 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6711cdff-410c-4d91-b172-c2065054c1be" (UID: "6711cdff-410c-4d91-b172-c2065054c1be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.164172 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.164232 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svz5p\" (UniqueName: \"kubernetes.io/projected/6711cdff-410c-4d91-b172-c2065054c1be-kube-api-access-svz5p\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.164247 4764 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-job-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.164257 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.426350 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.426407 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:23 crc kubenswrapper[4764]: 
I0309 14:10:23.450302 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-j6s45" event={"ID":"6711cdff-410c-4d91-b172-c2065054c1be","Type":"ContainerDied","Data":"feee9d84b5899b620a2823aa6ab377cd98db0a958fd8bd41a85078ce9d01392c"} Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.450357 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="feee9d84b5899b620a2823aa6ab377cd98db0a958fd8bd41a85078ce9d01392c" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.450433 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-j6s45" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.495369 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.549935 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.748008 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vks4j"] Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.779453 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Mar 09 14:10:23 crc kubenswrapper[4764]: E0309 14:10:23.780033 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6711cdff-410c-4d91-b172-c2065054c1be" containerName="manila-db-sync" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.780048 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6711cdff-410c-4d91-b172-c2065054c1be" containerName="manila-db-sync" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.780281 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6711cdff-410c-4d91-b172-c2065054c1be" containerName="manila-db-sync" Mar 09 14:10:23 crc 
kubenswrapper[4764]: I0309 14:10:23.782523 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.789051 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.790252 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-692nx" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.790597 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.790853 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.815050 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.843148 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.845206 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.851145 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.857375 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.884254 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.884337 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.884364 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-scripts\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.884395 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 
14:10:23.884427 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-config-data\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.884481 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq2r9\" (UniqueName: \"kubernetes.io/projected/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-kube-api-access-lq2r9\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.963516 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69655fd4bf-qlzqn"] Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.965320 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.987036 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/604c4e24-15e4-43e1-b08c-74d8337a2e71-ceph\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.987107 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-config-data\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.987168 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.987203 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.987237 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:23 crc 
kubenswrapper[4764]: I0309 14:10:23.987256 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-scripts\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.987278 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxzkm\" (UniqueName: \"kubernetes.io/projected/604c4e24-15e4-43e1-b08c-74d8337a2e71-kube-api-access-wxzkm\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.987304 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.987340 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-config-data\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.987357 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.987407 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/604c4e24-15e4-43e1-b08c-74d8337a2e71-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.987439 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq2r9\" (UniqueName: \"kubernetes.io/projected/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-kube-api-access-lq2r9\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.987468 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-scripts\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.987517 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/604c4e24-15e4-43e1-b08c-74d8337a2e71-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.987815 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.995780 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-config-data\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:23.999412 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.008856 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69655fd4bf-qlzqn"] Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.030291 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-scripts\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.050535 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.051152 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq2r9\" (UniqueName: \"kubernetes.io/projected/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-kube-api-access-lq2r9\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.092014 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.092081 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxzkm\" (UniqueName: \"kubernetes.io/projected/604c4e24-15e4-43e1-b08c-74d8337a2e71-kube-api-access-wxzkm\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.092111 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1552c7db-c992-4b43-8f1e-2b752d718f36-config\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.092148 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.092187 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/604c4e24-15e4-43e1-b08c-74d8337a2e71-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.092213 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/1552c7db-c992-4b43-8f1e-2b752d718f36-openstack-edpm-ipam\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.092239 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-scripts\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.092270 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pnsh\" (UniqueName: \"kubernetes.io/projected/1552c7db-c992-4b43-8f1e-2b752d718f36-kube-api-access-4pnsh\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.092295 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1552c7db-c992-4b43-8f1e-2b752d718f36-ovsdbserver-sb\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.092310 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/604c4e24-15e4-43e1-b08c-74d8337a2e71-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.092355 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/1552c7db-c992-4b43-8f1e-2b752d718f36-ovsdbserver-nb\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.092411 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/604c4e24-15e4-43e1-b08c-74d8337a2e71-ceph\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.092440 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-config-data\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.092462 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1552c7db-c992-4b43-8f1e-2b752d718f36-dns-svc\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.092922 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/604c4e24-15e4-43e1-b08c-74d8337a2e71-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.093413 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/604c4e24-15e4-43e1-b08c-74d8337a2e71-etc-machine-id\") pod \"manila-share-share1-0\" (UID: 
\"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.104843 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-config-data\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.104942 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.106584 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.114162 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.115886 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.117223 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.117892 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-scripts\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.120090 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.129358 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxzkm\" (UniqueName: \"kubernetes.io/projected/604c4e24-15e4-43e1-b08c-74d8337a2e71-kube-api-access-wxzkm\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.134627 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/604c4e24-15e4-43e1-b08c-74d8337a2e71-ceph\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.153128 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.179127 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.198906 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-config-data-custom\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.199544 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.199718 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b1ec63a-2933-4ca0-b695-d491eff9b77a-logs\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.199792 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-scripts\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.199843 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1552c7db-c992-4b43-8f1e-2b752d718f36-openstack-edpm-ipam\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 
14:10:24.199940 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pnsh\" (UniqueName: \"kubernetes.io/projected/1552c7db-c992-4b43-8f1e-2b752d718f36-kube-api-access-4pnsh\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.199971 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6b1ec63a-2933-4ca0-b695-d491eff9b77a-etc-machine-id\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.199996 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1552c7db-c992-4b43-8f1e-2b752d718f36-ovsdbserver-sb\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.200028 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-config-data\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.200123 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1552c7db-c992-4b43-8f1e-2b752d718f36-ovsdbserver-nb\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.201262 4764 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1552c7db-c992-4b43-8f1e-2b752d718f36-openstack-edpm-ipam\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.201848 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1552c7db-c992-4b43-8f1e-2b752d718f36-dns-svc\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.201991 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1552c7db-c992-4b43-8f1e-2b752d718f36-config\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.202024 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59kpm\" (UniqueName: \"kubernetes.io/projected/6b1ec63a-2933-4ca0-b695-d491eff9b77a-kube-api-access-59kpm\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.202170 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1552c7db-c992-4b43-8f1e-2b752d718f36-ovsdbserver-sb\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.203003 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/1552c7db-c992-4b43-8f1e-2b752d718f36-dns-svc\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.203766 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1552c7db-c992-4b43-8f1e-2b752d718f36-ovsdbserver-nb\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.208786 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1552c7db-c992-4b43-8f1e-2b752d718f36-config\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.235956 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pnsh\" (UniqueName: \"kubernetes.io/projected/1552c7db-c992-4b43-8f1e-2b752d718f36-kube-api-access-4pnsh\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.307451 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6b1ec63a-2933-4ca0-b695-d491eff9b77a-etc-machine-id\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.307517 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-config-data\") pod \"manila-api-0\" (UID: 
\"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.307711 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59kpm\" (UniqueName: \"kubernetes.io/projected/6b1ec63a-2933-4ca0-b695-d491eff9b77a-kube-api-access-59kpm\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.307724 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6b1ec63a-2933-4ca0-b695-d491eff9b77a-etc-machine-id\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.307744 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-config-data-custom\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.307839 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.307995 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b1ec63a-2933-4ca0-b695-d491eff9b77a-logs\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.308018 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-scripts\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.310086 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b1ec63a-2933-4ca0-b695-d491eff9b77a-logs\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.317263 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-scripts\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.317544 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-config-data\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.323614 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.329905 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-config-data-custom\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.337413 4764 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.352295 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59kpm\" (UniqueName: \"kubernetes.io/projected/6b1ec63a-2933-4ca0-b695-d491eff9b77a-kube-api-access-59kpm\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.353412 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.832942 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 09 14:10:25 crc kubenswrapper[4764]: I0309 14:10:25.024513 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 09 14:10:25 crc kubenswrapper[4764]: I0309 14:10:25.231451 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69655fd4bf-qlzqn"] Mar 09 14:10:25 crc kubenswrapper[4764]: I0309 14:10:25.361180 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 09 14:10:25 crc kubenswrapper[4764]: W0309 14:10:25.450270 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b1ec63a_2933_4ca0_b695_d491eff9b77a.slice/crio-9f8093a4057728ba1ffbcdd3c65ef3993ffdc953d49b97b14351ed827aa851fa WatchSource:0}: Error finding container 9f8093a4057728ba1ffbcdd3c65ef3993ffdc953d49b97b14351ed827aa851fa: Status 404 returned error can't find the container with id 9f8093a4057728ba1ffbcdd3c65ef3993ffdc953d49b97b14351ed827aa851fa Mar 09 14:10:25 crc kubenswrapper[4764]: I0309 14:10:25.647727 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vks4j" 
podUID="f9c5571a-71f1-42d9-8025-2f51e13a5f03" containerName="registry-server" containerID="cri-o://fa0246eba6b220102d13a010f48f2dada3c5eb73ad9e6ea15c1975fe88b59d3b" gracePeriod=2 Mar 09 14:10:25 crc kubenswrapper[4764]: I0309 14:10:25.656863 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" event={"ID":"1552c7db-c992-4b43-8f1e-2b752d718f36","Type":"ContainerStarted","Data":"6bd1eef56b70234d3056028f3c04208e36deaced74a2fb80c47865a011b7fdf5"} Mar 09 14:10:25 crc kubenswrapper[4764]: I0309 14:10:25.656998 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"604c4e24-15e4-43e1-b08c-74d8337a2e71","Type":"ContainerStarted","Data":"99bc2f9e9539dc58eaf52226466b27a3c1c886f9a53c573b32136461c86d2a63"} Mar 09 14:10:25 crc kubenswrapper[4764]: I0309 14:10:25.657019 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6b1ec63a-2933-4ca0-b695-d491eff9b77a","Type":"ContainerStarted","Data":"9f8093a4057728ba1ffbcdd3c65ef3993ffdc953d49b97b14351ed827aa851fa"} Mar 09 14:10:25 crc kubenswrapper[4764]: I0309 14:10:25.657038 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"30f42bb2-6e26-4cbb-942b-a7d4ede4f128","Type":"ContainerStarted","Data":"69e9c4821e7bd2b8ea5bd5a2a4ce9a1fe30fca8c9b901cad13802f0443f55e38"} Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.323194 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.404378 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9c5571a-71f1-42d9-8025-2f51e13a5f03-catalog-content\") pod \"f9c5571a-71f1-42d9-8025-2f51e13a5f03\" (UID: \"f9c5571a-71f1-42d9-8025-2f51e13a5f03\") " Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.404514 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52qrw\" (UniqueName: \"kubernetes.io/projected/f9c5571a-71f1-42d9-8025-2f51e13a5f03-kube-api-access-52qrw\") pod \"f9c5571a-71f1-42d9-8025-2f51e13a5f03\" (UID: \"f9c5571a-71f1-42d9-8025-2f51e13a5f03\") " Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.404748 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9c5571a-71f1-42d9-8025-2f51e13a5f03-utilities\") pod \"f9c5571a-71f1-42d9-8025-2f51e13a5f03\" (UID: \"f9c5571a-71f1-42d9-8025-2f51e13a5f03\") " Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.406137 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9c5571a-71f1-42d9-8025-2f51e13a5f03-utilities" (OuterVolumeSpecName: "utilities") pod "f9c5571a-71f1-42d9-8025-2f51e13a5f03" (UID: "f9c5571a-71f1-42d9-8025-2f51e13a5f03"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.441380 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9c5571a-71f1-42d9-8025-2f51e13a5f03-kube-api-access-52qrw" (OuterVolumeSpecName: "kube-api-access-52qrw") pod "f9c5571a-71f1-42d9-8025-2f51e13a5f03" (UID: "f9c5571a-71f1-42d9-8025-2f51e13a5f03"). InnerVolumeSpecName "kube-api-access-52qrw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.498443 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9c5571a-71f1-42d9-8025-2f51e13a5f03-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9c5571a-71f1-42d9-8025-2f51e13a5f03" (UID: "f9c5571a-71f1-42d9-8025-2f51e13a5f03"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.508014 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9c5571a-71f1-42d9-8025-2f51e13a5f03-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.508061 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52qrw\" (UniqueName: \"kubernetes.io/projected/f9c5571a-71f1-42d9-8025-2f51e13a5f03-kube-api-access-52qrw\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.508076 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9c5571a-71f1-42d9-8025-2f51e13a5f03-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.672697 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6b1ec63a-2933-4ca0-b695-d491eff9b77a","Type":"ContainerStarted","Data":"13ac0cacb1783ca93930b2f289e68734abb6dbe01c5928e15fc30410946d427b"} Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.683162 4764 generic.go:334] "Generic (PLEG): container finished" podID="f9c5571a-71f1-42d9-8025-2f51e13a5f03" containerID="fa0246eba6b220102d13a010f48f2dada3c5eb73ad9e6ea15c1975fe88b59d3b" exitCode=0 Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.683263 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-vks4j" event={"ID":"f9c5571a-71f1-42d9-8025-2f51e13a5f03","Type":"ContainerDied","Data":"fa0246eba6b220102d13a010f48f2dada3c5eb73ad9e6ea15c1975fe88b59d3b"} Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.683308 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vks4j" event={"ID":"f9c5571a-71f1-42d9-8025-2f51e13a5f03","Type":"ContainerDied","Data":"3870660a56f929c820c0d702e6958704a17b57192fa2a52ac26a15baa43c7554"} Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.683333 4764 scope.go:117] "RemoveContainer" containerID="fa0246eba6b220102d13a010f48f2dada3c5eb73ad9e6ea15c1975fe88b59d3b" Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.683551 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.697065 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"30f42bb2-6e26-4cbb-942b-a7d4ede4f128","Type":"ContainerStarted","Data":"b743badbf09f2d2b0dc2fef97dfa9da34473a9bbc60350880026711252d7f4f7"} Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.704693 4764 generic.go:334] "Generic (PLEG): container finished" podID="1552c7db-c992-4b43-8f1e-2b752d718f36" containerID="69acb3fb37ca44376fdac26f2a551a3d9baa29888217269c37f66f9e543d7d8b" exitCode=0 Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.704749 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" event={"ID":"1552c7db-c992-4b43-8f1e-2b752d718f36","Type":"ContainerDied","Data":"69acb3fb37ca44376fdac26f2a551a3d9baa29888217269c37f66f9e543d7d8b"} Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.743163 4764 scope.go:117] "RemoveContainer" containerID="5810019c18336df6ff05082f8f948492687897d783cdf12ed4c00b5fb1717462" Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 
14:10:26.769353 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vks4j"] Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.785731 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vks4j"] Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.900695 4764 scope.go:117] "RemoveContainer" containerID="c4a8e2df77908e08ac398134c388212868113a4e069dbf9e9ca754666552eee8" Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.961444 4764 scope.go:117] "RemoveContainer" containerID="fa0246eba6b220102d13a010f48f2dada3c5eb73ad9e6ea15c1975fe88b59d3b" Mar 09 14:10:26 crc kubenswrapper[4764]: E0309 14:10:26.962226 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa0246eba6b220102d13a010f48f2dada3c5eb73ad9e6ea15c1975fe88b59d3b\": container with ID starting with fa0246eba6b220102d13a010f48f2dada3c5eb73ad9e6ea15c1975fe88b59d3b not found: ID does not exist" containerID="fa0246eba6b220102d13a010f48f2dada3c5eb73ad9e6ea15c1975fe88b59d3b" Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.962298 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa0246eba6b220102d13a010f48f2dada3c5eb73ad9e6ea15c1975fe88b59d3b"} err="failed to get container status \"fa0246eba6b220102d13a010f48f2dada3c5eb73ad9e6ea15c1975fe88b59d3b\": rpc error: code = NotFound desc = could not find container \"fa0246eba6b220102d13a010f48f2dada3c5eb73ad9e6ea15c1975fe88b59d3b\": container with ID starting with fa0246eba6b220102d13a010f48f2dada3c5eb73ad9e6ea15c1975fe88b59d3b not found: ID does not exist" Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.962331 4764 scope.go:117] "RemoveContainer" containerID="5810019c18336df6ff05082f8f948492687897d783cdf12ed4c00b5fb1717462" Mar 09 14:10:26 crc kubenswrapper[4764]: E0309 14:10:26.962895 4764 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5810019c18336df6ff05082f8f948492687897d783cdf12ed4c00b5fb1717462\": container with ID starting with 5810019c18336df6ff05082f8f948492687897d783cdf12ed4c00b5fb1717462 not found: ID does not exist" containerID="5810019c18336df6ff05082f8f948492687897d783cdf12ed4c00b5fb1717462" Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.962946 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5810019c18336df6ff05082f8f948492687897d783cdf12ed4c00b5fb1717462"} err="failed to get container status \"5810019c18336df6ff05082f8f948492687897d783cdf12ed4c00b5fb1717462\": rpc error: code = NotFound desc = could not find container \"5810019c18336df6ff05082f8f948492687897d783cdf12ed4c00b5fb1717462\": container with ID starting with 5810019c18336df6ff05082f8f948492687897d783cdf12ed4c00b5fb1717462 not found: ID does not exist" Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.962984 4764 scope.go:117] "RemoveContainer" containerID="c4a8e2df77908e08ac398134c388212868113a4e069dbf9e9ca754666552eee8" Mar 09 14:10:26 crc kubenswrapper[4764]: E0309 14:10:26.963536 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4a8e2df77908e08ac398134c388212868113a4e069dbf9e9ca754666552eee8\": container with ID starting with c4a8e2df77908e08ac398134c388212868113a4e069dbf9e9ca754666552eee8 not found: ID does not exist" containerID="c4a8e2df77908e08ac398134c388212868113a4e069dbf9e9ca754666552eee8" Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.963588 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4a8e2df77908e08ac398134c388212868113a4e069dbf9e9ca754666552eee8"} err="failed to get container status \"c4a8e2df77908e08ac398134c388212868113a4e069dbf9e9ca754666552eee8\": rpc error: code = NotFound desc = could not find container 
\"c4a8e2df77908e08ac398134c388212868113a4e069dbf9e9ca754666552eee8\": container with ID starting with c4a8e2df77908e08ac398134c388212868113a4e069dbf9e9ca754666552eee8 not found: ID does not exist" Mar 09 14:10:27 crc kubenswrapper[4764]: I0309 14:10:27.264453 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Mar 09 14:10:27 crc kubenswrapper[4764]: I0309 14:10:27.576161 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9c5571a-71f1-42d9-8025-2f51e13a5f03" path="/var/lib/kubelet/pods/f9c5571a-71f1-42d9-8025-2f51e13a5f03/volumes" Mar 09 14:10:27 crc kubenswrapper[4764]: I0309 14:10:27.723100 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6b1ec63a-2933-4ca0-b695-d491eff9b77a","Type":"ContainerStarted","Data":"5233e0d8d749134147c23cffe783f6754b3b1078a5357cfc1ad88930b4c34b07"} Mar 09 14:10:27 crc kubenswrapper[4764]: I0309 14:10:27.723260 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="6b1ec63a-2933-4ca0-b695-d491eff9b77a" containerName="manila-api-log" containerID="cri-o://13ac0cacb1783ca93930b2f289e68734abb6dbe01c5928e15fc30410946d427b" gracePeriod=30 Mar 09 14:10:27 crc kubenswrapper[4764]: I0309 14:10:27.723473 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="6b1ec63a-2933-4ca0-b695-d491eff9b77a" containerName="manila-api" containerID="cri-o://5233e0d8d749134147c23cffe783f6754b3b1078a5357cfc1ad88930b4c34b07" gracePeriod=30 Mar 09 14:10:27 crc kubenswrapper[4764]: I0309 14:10:27.723795 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Mar 09 14:10:27 crc kubenswrapper[4764]: I0309 14:10:27.731905 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" 
event={"ID":"30f42bb2-6e26-4cbb-942b-a7d4ede4f128","Type":"ContainerStarted","Data":"cbc2d80e7e756e93ed514bd09529210c2d50987e0f38ca0f0f28ae42fe9a1fc9"} Mar 09 14:10:27 crc kubenswrapper[4764]: I0309 14:10:27.743973 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" event={"ID":"1552c7db-c992-4b43-8f1e-2b752d718f36","Type":"ContainerStarted","Data":"c90f8caec5bec2076fee7dbcaea582166bddda9aa7d92ec51426ab2db7e8b3a2"} Mar 09 14:10:27 crc kubenswrapper[4764]: I0309 14:10:27.744227 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:27 crc kubenswrapper[4764]: I0309 14:10:27.758014 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.757979625 podStartE2EDuration="3.757979625s" podCreationTimestamp="2026-03-09 14:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:10:27.74880048 +0000 UTC m=+2982.998972398" watchObservedRunningTime="2026-03-09 14:10:27.757979625 +0000 UTC m=+2983.008151533" Mar 09 14:10:27 crc kubenswrapper[4764]: I0309 14:10:27.798015 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" podStartSLOduration=4.797975033 podStartE2EDuration="4.797975033s" podCreationTimestamp="2026-03-09 14:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:10:27.791542611 +0000 UTC m=+2983.041714529" watchObservedRunningTime="2026-03-09 14:10:27.797975033 +0000 UTC m=+2983.048146941" Mar 09 14:10:27 crc kubenswrapper[4764]: I0309 14:10:27.830765 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=4.131416666 
podStartE2EDuration="4.830730828s" podCreationTimestamp="2026-03-09 14:10:23 +0000 UTC" firstStartedPulling="2026-03-09 14:10:24.816075195 +0000 UTC m=+2980.066247103" lastFinishedPulling="2026-03-09 14:10:25.515389357 +0000 UTC m=+2980.765561265" observedRunningTime="2026-03-09 14:10:27.820832754 +0000 UTC m=+2983.071004662" watchObservedRunningTime="2026-03-09 14:10:27.830730828 +0000 UTC m=+2983.080902736" Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.370247 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.370609 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.370735 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.371705 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb"} pod="openshift-machine-config-operator/machine-config-daemon-xxczl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.371756 4764 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" containerID="cri-o://085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" gracePeriod=600 Mar 09 14:10:28 crc kubenswrapper[4764]: E0309 14:10:28.536289 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.771369 4764 generic.go:334] "Generic (PLEG): container finished" podID="6bcdd179-43c2-427c-9fac-7155c122e922" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" exitCode=0 Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.771848 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerDied","Data":"085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb"} Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.771893 4764 scope.go:117] "RemoveContainer" containerID="63776e57dad14b6a74a94432519b0b437cfc2082448e28e0a32706fa439569a0" Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.772938 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:10:28 crc kubenswrapper[4764]: E0309 14:10:28.773228 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.781495 4764 generic.go:334] "Generic (PLEG): container finished" podID="6b1ec63a-2933-4ca0-b695-d491eff9b77a" containerID="5233e0d8d749134147c23cffe783f6754b3b1078a5357cfc1ad88930b4c34b07" exitCode=0 Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.781537 4764 generic.go:334] "Generic (PLEG): container finished" podID="6b1ec63a-2933-4ca0-b695-d491eff9b77a" containerID="13ac0cacb1783ca93930b2f289e68734abb6dbe01c5928e15fc30410946d427b" exitCode=143 Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.781591 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6b1ec63a-2933-4ca0-b695-d491eff9b77a","Type":"ContainerDied","Data":"5233e0d8d749134147c23cffe783f6754b3b1078a5357cfc1ad88930b4c34b07"} Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.781677 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6b1ec63a-2933-4ca0-b695-d491eff9b77a","Type":"ContainerDied","Data":"13ac0cacb1783ca93930b2f289e68734abb6dbe01c5928e15fc30410946d427b"} Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.874787 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.914689 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6b1ec63a-2933-4ca0-b695-d491eff9b77a-etc-machine-id\") pod \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.914787 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6b1ec63a-2933-4ca0-b695-d491eff9b77a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6b1ec63a-2933-4ca0-b695-d491eff9b77a" (UID: "6b1ec63a-2933-4ca0-b695-d491eff9b77a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.914975 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b1ec63a-2933-4ca0-b695-d491eff9b77a-logs\") pod \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.915612 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b1ec63a-2933-4ca0-b695-d491eff9b77a-logs" (OuterVolumeSpecName: "logs") pod "6b1ec63a-2933-4ca0-b695-d491eff9b77a" (UID: "6b1ec63a-2933-4ca0-b695-d491eff9b77a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.915732 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-config-data-custom\") pod \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.917265 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59kpm\" (UniqueName: \"kubernetes.io/projected/6b1ec63a-2933-4ca0-b695-d491eff9b77a-kube-api-access-59kpm\") pod \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.917438 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-combined-ca-bundle\") pod \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.917537 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-scripts\") pod \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.917822 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-config-data\") pod \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.920837 4764 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/6b1ec63a-2933-4ca0-b695-d491eff9b77a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.921075 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b1ec63a-2933-4ca0-b695-d491eff9b77a-logs\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.929348 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b1ec63a-2933-4ca0-b695-d491eff9b77a-kube-api-access-59kpm" (OuterVolumeSpecName: "kube-api-access-59kpm") pod "6b1ec63a-2933-4ca0-b695-d491eff9b77a" (UID: "6b1ec63a-2933-4ca0-b695-d491eff9b77a"). InnerVolumeSpecName "kube-api-access-59kpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.940616 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-scripts" (OuterVolumeSpecName: "scripts") pod "6b1ec63a-2933-4ca0-b695-d491eff9b77a" (UID: "6b1ec63a-2933-4ca0-b695-d491eff9b77a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.941871 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6b1ec63a-2933-4ca0-b695-d491eff9b77a" (UID: "6b1ec63a-2933-4ca0-b695-d491eff9b77a"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.979030 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b1ec63a-2933-4ca0-b695-d491eff9b77a" (UID: "6b1ec63a-2933-4ca0-b695-d491eff9b77a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.023536 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.023585 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59kpm\" (UniqueName: \"kubernetes.io/projected/6b1ec63a-2933-4ca0-b695-d491eff9b77a-kube-api-access-59kpm\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.023610 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.023673 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.037474 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-config-data" (OuterVolumeSpecName: "config-data") pod "6b1ec63a-2933-4ca0-b695-d491eff9b77a" (UID: "6b1ec63a-2933-4ca0-b695-d491eff9b77a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.126499 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.797893 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6b1ec63a-2933-4ca0-b695-d491eff9b77a","Type":"ContainerDied","Data":"9f8093a4057728ba1ffbcdd3c65ef3993ffdc953d49b97b14351ed827aa851fa"} Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.797953 4764 scope.go:117] "RemoveContainer" containerID="5233e0d8d749134147c23cffe783f6754b3b1078a5357cfc1ad88930b4c34b07" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.797979 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.830003 4764 scope.go:117] "RemoveContainer" containerID="13ac0cacb1783ca93930b2f289e68734abb6dbe01c5928e15fc30410946d427b" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.831759 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.846774 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"] Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.874614 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Mar 09 14:10:29 crc kubenswrapper[4764]: E0309 14:10:29.875267 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9c5571a-71f1-42d9-8025-2f51e13a5f03" containerName="extract-content" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.875288 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c5571a-71f1-42d9-8025-2f51e13a5f03" containerName="extract-content" Mar 09 14:10:29 crc 
kubenswrapper[4764]: E0309 14:10:29.875331 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b1ec63a-2933-4ca0-b695-d491eff9b77a" containerName="manila-api" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.875338 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b1ec63a-2933-4ca0-b695-d491eff9b77a" containerName="manila-api" Mar 09 14:10:29 crc kubenswrapper[4764]: E0309 14:10:29.875364 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9c5571a-71f1-42d9-8025-2f51e13a5f03" containerName="extract-utilities" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.875375 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c5571a-71f1-42d9-8025-2f51e13a5f03" containerName="extract-utilities" Mar 09 14:10:29 crc kubenswrapper[4764]: E0309 14:10:29.875384 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9c5571a-71f1-42d9-8025-2f51e13a5f03" containerName="registry-server" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.875391 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c5571a-71f1-42d9-8025-2f51e13a5f03" containerName="registry-server" Mar 09 14:10:29 crc kubenswrapper[4764]: E0309 14:10:29.875402 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b1ec63a-2933-4ca0-b695-d491eff9b77a" containerName="manila-api-log" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.875409 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b1ec63a-2933-4ca0-b695-d491eff9b77a" containerName="manila-api-log" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.875619 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9c5571a-71f1-42d9-8025-2f51e13a5f03" containerName="registry-server" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.875672 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b1ec63a-2933-4ca0-b695-d491eff9b77a" containerName="manila-api-log" Mar 09 14:10:29 crc 
kubenswrapper[4764]: I0309 14:10:29.875695 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b1ec63a-2933-4ca0-b695-d491eff9b77a" containerName="manila-api" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.876970 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.882218 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.882543 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.882846 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.912776 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.952424 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-config-data-custom\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.952503 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tlj9\" (UniqueName: \"kubernetes.io/projected/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-kube-api-access-6tlj9\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.952557 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.952582 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-internal-tls-certs\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.952613 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-config-data\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.952711 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-scripts\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.952748 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-logs\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.954038 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-public-tls-certs\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " 
pod="openstack/manila-api-0" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.954128 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-etc-machine-id\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.058793 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-config-data-custom\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.058864 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tlj9\" (UniqueName: \"kubernetes.io/projected/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-kube-api-access-6tlj9\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.059804 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.059840 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-internal-tls-certs\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.059871 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-config-data\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.060818 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-scripts\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.060854 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-logs\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.060979 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-public-tls-certs\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.061025 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-etc-machine-id\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.061263 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-etc-machine-id\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 
14:10:30.062213 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-logs\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.066128 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-config-data-custom\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.066925 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-config-data\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.067481 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.081355 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-scripts\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.081476 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-internal-tls-certs\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " 
pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.081747 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-public-tls-certs\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.085676 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tlj9\" (UniqueName: \"kubernetes.io/projected/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-kube-api-access-6tlj9\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.218980 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.803736 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.804344 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerName="ceilometer-central-agent" containerID="cri-o://ff9bf0b1ebb34e175d58582b99ddb3f1097e979301d7cd95e80e6e75808835d6" gracePeriod=30 Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.804479 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerName="proxy-httpd" containerID="cri-o://5ee7ad3de1f9928c6b59a6676d88b6e84ba3c1fa18e74865387fac5e3b0fa60c" gracePeriod=30 Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.804514 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerName="sg-core" 
containerID="cri-o://c44f22885081c03e82b6817f6378545ad23fd40c3fd48004324856781c5d89d8" gracePeriod=30 Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.804550 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerName="ceilometer-notification-agent" containerID="cri-o://cb0403e3d98f6b11f541bb3ed05951b3aa36eed0ed52f4863be171ce5eaf63e2" gracePeriod=30 Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.884169 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.884831 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.917003 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.606090 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b1ec63a-2933-4ca0-b695-d491eff9b77a" path="/var/lib/kubelet/pods/6b1ec63a-2933-4ca0-b695-d491eff9b77a/volumes" Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.930305 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.965147 4764 generic.go:334] "Generic (PLEG): container finished" podID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerID="5ee7ad3de1f9928c6b59a6676d88b6e84ba3c1fa18e74865387fac5e3b0fa60c" exitCode=0 Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.965551 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7220ea42-daf2-4c41-85c9-0d2bda6d24eb","Type":"ContainerDied","Data":"5ee7ad3de1f9928c6b59a6676d88b6e84ba3c1fa18e74865387fac5e3b0fa60c"} Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.965708 4764 generic.go:334] "Generic (PLEG): container finished" podID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerID="c44f22885081c03e82b6817f6378545ad23fd40c3fd48004324856781c5d89d8" exitCode=2 Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.965724 4764 generic.go:334] "Generic (PLEG): container finished" podID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerID="cb0403e3d98f6b11f541bb3ed05951b3aa36eed0ed52f4863be171ce5eaf63e2" exitCode=0 Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.965735 4764 generic.go:334] "Generic (PLEG): container finished" podID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerID="ff9bf0b1ebb34e175d58582b99ddb3f1097e979301d7cd95e80e6e75808835d6" exitCode=0 Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.965904 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7220ea42-daf2-4c41-85c9-0d2bda6d24eb","Type":"ContainerDied","Data":"c44f22885081c03e82b6817f6378545ad23fd40c3fd48004324856781c5d89d8"} Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.965928 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7220ea42-daf2-4c41-85c9-0d2bda6d24eb","Type":"ContainerDied","Data":"cb0403e3d98f6b11f541bb3ed05951b3aa36eed0ed52f4863be171ce5eaf63e2"} Mar 09 14:10:31 crc 
kubenswrapper[4764]: I0309 14:10:31.965947 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7220ea42-daf2-4c41-85c9-0d2bda6d24eb","Type":"ContainerDied","Data":"ff9bf0b1ebb34e175d58582b99ddb3f1097e979301d7cd95e80e6e75808835d6"} Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.982945 4764 generic.go:334] "Generic (PLEG): container finished" podID="849dc0e6-d7a3-4745-8cc0-7b7af3e4d243" containerID="c5d8efe5920ca730c02900ccbfc9cdc3905256778569090d306fce8d9fecf051" exitCode=137 Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.982984 4764 generic.go:334] "Generic (PLEG): container finished" podID="849dc0e6-d7a3-4745-8cc0-7b7af3e4d243" containerID="fabc9d27abd14d14db86a85f4413de77c2fd101b8722a207f502ad9ee60b4405" exitCode=137 Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.983035 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9d94c4c7-px8jh" event={"ID":"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243","Type":"ContainerDied","Data":"c5d8efe5920ca730c02900ccbfc9cdc3905256778569090d306fce8d9fecf051"} Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.983067 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9d94c4c7-px8jh" event={"ID":"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243","Type":"ContainerDied","Data":"fabc9d27abd14d14db86a85f4413de77c2fd101b8722a207f502ad9ee60b4405"} Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.994491 4764 generic.go:334] "Generic (PLEG): container finished" podID="6b944cf9-8278-4b16-b09c-0da6a2519b2a" containerID="b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd" exitCode=137 Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.994530 4764 generic.go:334] "Generic (PLEG): container finished" podID="6b944cf9-8278-4b16-b09c-0da6a2519b2a" containerID="1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1" exitCode=137 Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.994597 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55f8b7fc4c-6rdd7" event={"ID":"6b944cf9-8278-4b16-b09c-0da6a2519b2a","Type":"ContainerDied","Data":"b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd"} Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.994627 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55f8b7fc4c-6rdd7" event={"ID":"6b944cf9-8278-4b16-b09c-0da6a2519b2a","Type":"ContainerDied","Data":"1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1"} Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.994636 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55f8b7fc4c-6rdd7" event={"ID":"6b944cf9-8278-4b16-b09c-0da6a2519b2a","Type":"ContainerDied","Data":"03eb026db669a1bbc8569cf6227719fa0346433ca9e38091ef450a1f5669648c"} Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.994672 4764 scope.go:117] "RemoveContainer" containerID="b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd" Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.994761 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.005559 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d","Type":"ContainerStarted","Data":"970d9fd4daa3fbe40f5c9a6a8b41d61d6de1c7fac4538a6256c7207ce935e71b"} Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.005613 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d","Type":"ContainerStarted","Data":"fcf3d8e0199dcae7fca5124073c5e5842dc32d90dcd90cf221f911ec93541560"} Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.021630 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b944cf9-8278-4b16-b09c-0da6a2519b2a-scripts\") pod \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.021744 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlm9t\" (UniqueName: \"kubernetes.io/projected/6b944cf9-8278-4b16-b09c-0da6a2519b2a-kube-api-access-nlm9t\") pod \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.021906 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b944cf9-8278-4b16-b09c-0da6a2519b2a-config-data\") pod \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.022008 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b944cf9-8278-4b16-b09c-0da6a2519b2a-logs\") pod 
\"6b944cf9-8278-4b16-b09c-0da6a2519b2a\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.022055 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6b944cf9-8278-4b16-b09c-0da6a2519b2a-horizon-secret-key\") pod \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.024175 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b944cf9-8278-4b16-b09c-0da6a2519b2a-logs" (OuterVolumeSpecName: "logs") pod "6b944cf9-8278-4b16-b09c-0da6a2519b2a" (UID: "6b944cf9-8278-4b16-b09c-0da6a2519b2a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.025106 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.029288 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b944cf9-8278-4b16-b09c-0da6a2519b2a-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "6b944cf9-8278-4b16-b09c-0da6a2519b2a" (UID: "6b944cf9-8278-4b16-b09c-0da6a2519b2a"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.035624 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b944cf9-8278-4b16-b09c-0da6a2519b2a-kube-api-access-nlm9t" (OuterVolumeSpecName: "kube-api-access-nlm9t") pod "6b944cf9-8278-4b16-b09c-0da6a2519b2a" (UID: "6b944cf9-8278-4b16-b09c-0da6a2519b2a"). InnerVolumeSpecName "kube-api-access-nlm9t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.117538 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b944cf9-8278-4b16-b09c-0da6a2519b2a-config-data" (OuterVolumeSpecName: "config-data") pod "6b944cf9-8278-4b16-b09c-0da6a2519b2a" (UID: "6b944cf9-8278-4b16-b09c-0da6a2519b2a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.117474 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b944cf9-8278-4b16-b09c-0da6a2519b2a-scripts" (OuterVolumeSpecName: "scripts") pod "6b944cf9-8278-4b16-b09c-0da6a2519b2a" (UID: "6b944cf9-8278-4b16-b09c-0da6a2519b2a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.124458 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-scripts\") pod \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.124609 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-horizon-secret-key\") pod \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.124726 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktc9m\" (UniqueName: \"kubernetes.io/projected/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-kube-api-access-ktc9m\") pod \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " Mar 09 14:10:32 crc 
kubenswrapper[4764]: I0309 14:10:32.124799 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-logs\") pod \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.124855 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-config-data\") pod \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.125974 4764 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6b944cf9-8278-4b16-b09c-0da6a2519b2a-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.125995 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b944cf9-8278-4b16-b09c-0da6a2519b2a-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.126010 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlm9t\" (UniqueName: \"kubernetes.io/projected/6b944cf9-8278-4b16-b09c-0da6a2519b2a-kube-api-access-nlm9t\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.126026 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b944cf9-8278-4b16-b09c-0da6a2519b2a-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.126038 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b944cf9-8278-4b16-b09c-0da6a2519b2a-logs\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 
crc kubenswrapper[4764]: I0309 14:10:32.130425 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-logs" (OuterVolumeSpecName: "logs") pod "849dc0e6-d7a3-4745-8cc0-7b7af3e4d243" (UID: "849dc0e6-d7a3-4745-8cc0-7b7af3e4d243"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.134432 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-kube-api-access-ktc9m" (OuterVolumeSpecName: "kube-api-access-ktc9m") pod "849dc0e6-d7a3-4745-8cc0-7b7af3e4d243" (UID: "849dc0e6-d7a3-4745-8cc0-7b7af3e4d243"). InnerVolumeSpecName "kube-api-access-ktc9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.137526 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "849dc0e6-d7a3-4745-8cc0-7b7af3e4d243" (UID: "849dc0e6-d7a3-4745-8cc0-7b7af3e4d243"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.151978 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-scripts" (OuterVolumeSpecName: "scripts") pod "849dc0e6-d7a3-4745-8cc0-7b7af3e4d243" (UID: "849dc0e6-d7a3-4745-8cc0-7b7af3e4d243"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.168885 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-config-data" (OuterVolumeSpecName: "config-data") pod "849dc0e6-d7a3-4745-8cc0-7b7af3e4d243" (UID: "849dc0e6-d7a3-4745-8cc0-7b7af3e4d243"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.228494 4764 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.228541 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktc9m\" (UniqueName: \"kubernetes.io/projected/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-kube-api-access-ktc9m\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.228556 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-logs\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.228569 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.228581 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.306512 4764 scope.go:117] "RemoveContainer" containerID="1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1" Mar 09 
14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.488300 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.498582 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-55f8b7fc4c-6rdd7"] Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.527335 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-55f8b7fc4c-6rdd7"] Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.537533 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-sg-core-conf-yaml\") pod \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.537599 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-combined-ca-bundle\") pod \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.537689 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-run-httpd\") pod \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.537723 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-scripts\") pod \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.537812 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-log-httpd\") pod \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.537853 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-config-data\") pod \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.537888 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-ceilometer-tls-certs\") pod \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.538112 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxjf8\" (UniqueName: \"kubernetes.io/projected/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-kube-api-access-kxjf8\") pod \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.540235 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7220ea42-daf2-4c41-85c9-0d2bda6d24eb" (UID: "7220ea42-daf2-4c41-85c9-0d2bda6d24eb"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.540336 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7220ea42-daf2-4c41-85c9-0d2bda6d24eb" (UID: "7220ea42-daf2-4c41-85c9-0d2bda6d24eb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.547518 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-scripts" (OuterVolumeSpecName: "scripts") pod "7220ea42-daf2-4c41-85c9-0d2bda6d24eb" (UID: "7220ea42-daf2-4c41-85c9-0d2bda6d24eb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.575715 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-kube-api-access-kxjf8" (OuterVolumeSpecName: "kube-api-access-kxjf8") pod "7220ea42-daf2-4c41-85c9-0d2bda6d24eb" (UID: "7220ea42-daf2-4c41-85c9-0d2bda6d24eb"). InnerVolumeSpecName "kube-api-access-kxjf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.611946 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7220ea42-daf2-4c41-85c9-0d2bda6d24eb" (UID: "7220ea42-daf2-4c41-85c9-0d2bda6d24eb"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.647450 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.647503 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.647518 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.647531 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxjf8\" (UniqueName: \"kubernetes.io/projected/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-kube-api-access-kxjf8\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.647543 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.703163 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "7220ea42-daf2-4c41-85c9-0d2bda6d24eb" (UID: "7220ea42-daf2-4c41-85c9-0d2bda6d24eb"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.750495 4764 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.787537 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7220ea42-daf2-4c41-85c9-0d2bda6d24eb" (UID: "7220ea42-daf2-4c41-85c9-0d2bda6d24eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.799446 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-config-data" (OuterVolumeSpecName: "config-data") pod "7220ea42-daf2-4c41-85c9-0d2bda6d24eb" (UID: "7220ea42-daf2-4c41-85c9-0d2bda6d24eb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.852736 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.852798 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.039164 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.039195 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9d94c4c7-px8jh" event={"ID":"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243","Type":"ContainerDied","Data":"dc93d1c29f222a1602e24053b4830b3eddb16bbf1d9374aeb6068fdf3ba6030f"} Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.078774 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-9d94c4c7-px8jh"] Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.081530 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d","Type":"ContainerStarted","Data":"8f2ad30ba8f80d63139adfaea98a7ecd975946f15c7c05e8f173a403c0e77b0a"} Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.081823 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.095674 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-9d94c4c7-px8jh"] Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.100845 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7220ea42-daf2-4c41-85c9-0d2bda6d24eb","Type":"ContainerDied","Data":"83415e6b4960e541c9fc0ec3cd4865cce73b704e20a342572bf182a8978c8bc9"} Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.101059 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.115027 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=4.115007431 podStartE2EDuration="4.115007431s" podCreationTimestamp="2026-03-09 14:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:10:33.109446332 +0000 UTC m=+2988.359618270" watchObservedRunningTime="2026-03-09 14:10:33.115007431 +0000 UTC m=+2988.365179339" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.181815 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.252302 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.267980 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:10:33 crc kubenswrapper[4764]: E0309 14:10:33.268724 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b944cf9-8278-4b16-b09c-0da6a2519b2a" containerName="horizon" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.268744 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b944cf9-8278-4b16-b09c-0da6a2519b2a" containerName="horizon" Mar 09 14:10:33 crc kubenswrapper[4764]: E0309 14:10:33.268765 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerName="ceilometer-central-agent" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.268773 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerName="ceilometer-central-agent" Mar 09 14:10:33 crc kubenswrapper[4764]: E0309 14:10:33.268793 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="849dc0e6-d7a3-4745-8cc0-7b7af3e4d243" containerName="horizon-log" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.268802 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="849dc0e6-d7a3-4745-8cc0-7b7af3e4d243" containerName="horizon-log" Mar 09 14:10:33 crc kubenswrapper[4764]: E0309 14:10:33.268817 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerName="ceilometer-notification-agent" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.268824 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerName="ceilometer-notification-agent" Mar 09 14:10:33 crc kubenswrapper[4764]: E0309 14:10:33.268842 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b944cf9-8278-4b16-b09c-0da6a2519b2a" containerName="horizon-log" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.268852 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b944cf9-8278-4b16-b09c-0da6a2519b2a" containerName="horizon-log" Mar 09 14:10:33 crc kubenswrapper[4764]: E0309 14:10:33.268870 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerName="proxy-httpd" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.268877 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerName="proxy-httpd" Mar 09 14:10:33 crc kubenswrapper[4764]: E0309 14:10:33.268904 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="849dc0e6-d7a3-4745-8cc0-7b7af3e4d243" containerName="horizon" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.268912 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="849dc0e6-d7a3-4745-8cc0-7b7af3e4d243" containerName="horizon" Mar 09 14:10:33 crc kubenswrapper[4764]: E0309 14:10:33.268926 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerName="sg-core" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.268944 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerName="sg-core" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.269162 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerName="proxy-httpd" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.269176 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerName="sg-core" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.269183 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerName="ceilometer-notification-agent" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.269193 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="849dc0e6-d7a3-4745-8cc0-7b7af3e4d243" containerName="horizon-log" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.269204 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b944cf9-8278-4b16-b09c-0da6a2519b2a" containerName="horizon" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.269216 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b944cf9-8278-4b16-b09c-0da6a2519b2a" containerName="horizon-log" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.269225 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerName="ceilometer-central-agent" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.269239 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="849dc0e6-d7a3-4745-8cc0-7b7af3e4d243" containerName="horizon" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.271545 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.279758 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.281289 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.281806 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.282163 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.312395 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.381564 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.381704 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cba66048-6589-4bda-99bc-f2b62d5a16cd-log-httpd\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.381772 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " 
pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.381841 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-scripts\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.381878 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-config-data\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.381944 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cba66048-6589-4bda-99bc-f2b62d5a16cd-run-httpd\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.382044 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccg6p\" (UniqueName: \"kubernetes.io/projected/cba66048-6589-4bda-99bc-f2b62d5a16cd-kube-api-access-ccg6p\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.382103 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.444282 4764 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.485255 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cba66048-6589-4bda-99bc-f2b62d5a16cd-run-httpd\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.485343 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccg6p\" (UniqueName: \"kubernetes.io/projected/cba66048-6589-4bda-99bc-f2b62d5a16cd-kube-api-access-ccg6p\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.485378 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.485430 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.485457 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cba66048-6589-4bda-99bc-f2b62d5a16cd-log-httpd\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.485501 4764 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.485546 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-scripts\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.485590 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-config-data\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.487610 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cba66048-6589-4bda-99bc-f2b62d5a16cd-run-httpd\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.488068 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cba66048-6589-4bda-99bc-f2b62d5a16cd-log-httpd\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.494496 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 
14:10:33.495165 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.510534 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.512599 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-config-data\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.524944 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-scripts\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.538405 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccg6p\" (UniqueName: \"kubernetes.io/projected/cba66048-6589-4bda-99bc-f2b62d5a16cd-kube-api-access-ccg6p\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.584272 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b944cf9-8278-4b16-b09c-0da6a2519b2a" path="/var/lib/kubelet/pods/6b944cf9-8278-4b16-b09c-0da6a2519b2a/volumes" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 
14:10:33.585285 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" path="/var/lib/kubelet/pods/7220ea42-daf2-4c41-85c9-0d2bda6d24eb/volumes" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.586798 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="849dc0e6-d7a3-4745-8cc0-7b7af3e4d243" path="/var/lib/kubelet/pods/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243/volumes" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.587526 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-797d44c9b-wrlx7"] Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.683403 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:10:34 crc kubenswrapper[4764]: I0309 14:10:34.113135 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-797d44c9b-wrlx7" podUID="e00fc104-ec73-4190-a598-86de7ca6cfa5" containerName="horizon-log" containerID="cri-o://ed7bd666dbaa56d7a77980118a8739a44cafc9a966ea8469e78e592a4a49ac96" gracePeriod=30 Mar 09 14:10:34 crc kubenswrapper[4764]: I0309 14:10:34.113291 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-797d44c9b-wrlx7" podUID="e00fc104-ec73-4190-a598-86de7ca6cfa5" containerName="horizon" containerID="cri-o://c791eb4afad3d3b7319a4ecbb9ad7314d04c8eb6a4954765b916e231a57d04c0" gracePeriod=30 Mar 09 14:10:34 crc kubenswrapper[4764]: I0309 14:10:34.120123 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Mar 09 14:10:34 crc kubenswrapper[4764]: I0309 14:10:34.339857 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:34 crc kubenswrapper[4764]: I0309 14:10:34.477443 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-fbc59fbb7-688wh"] Mar 09 14:10:34 crc kubenswrapper[4764]: I0309 14:10:34.478063 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" podUID="6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff" containerName="dnsmasq-dns" containerID="cri-o://0b92f70675fc36bf2d871b5244dabd98b74bc41c0324eee9ebebf35ece42f834" gracePeriod=10 Mar 09 14:10:35 crc kubenswrapper[4764]: I0309 14:10:35.134919 4764 generic.go:334] "Generic (PLEG): container finished" podID="6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff" containerID="0b92f70675fc36bf2d871b5244dabd98b74bc41c0324eee9ebebf35ece42f834" exitCode=0 Mar 09 14:10:35 crc kubenswrapper[4764]: I0309 14:10:35.134982 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" event={"ID":"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff","Type":"ContainerDied","Data":"0b92f70675fc36bf2d871b5244dabd98b74bc41c0324eee9ebebf35ece42f834"} Mar 09 14:10:36 crc kubenswrapper[4764]: I0309 14:10:36.718632 4764 scope.go:117] "RemoveContainer" containerID="b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd" Mar 09 14:10:36 crc kubenswrapper[4764]: E0309 14:10:36.719369 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd\": container with ID starting with b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd not found: ID does not exist" containerID="b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd" Mar 09 14:10:36 crc kubenswrapper[4764]: I0309 14:10:36.719437 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd"} err="failed to get container status \"b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd\": rpc error: code = NotFound desc = could not 
find container \"b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd\": container with ID starting with b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd not found: ID does not exist" Mar 09 14:10:36 crc kubenswrapper[4764]: I0309 14:10:36.719472 4764 scope.go:117] "RemoveContainer" containerID="1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1" Mar 09 14:10:36 crc kubenswrapper[4764]: E0309 14:10:36.719873 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1\": container with ID starting with 1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1 not found: ID does not exist" containerID="1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1" Mar 09 14:10:36 crc kubenswrapper[4764]: I0309 14:10:36.719942 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1"} err="failed to get container status \"1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1\": rpc error: code = NotFound desc = could not find container \"1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1\": container with ID starting with 1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1 not found: ID does not exist" Mar 09 14:10:36 crc kubenswrapper[4764]: I0309 14:10:36.720004 4764 scope.go:117] "RemoveContainer" containerID="b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd" Mar 09 14:10:36 crc kubenswrapper[4764]: I0309 14:10:36.720203 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd"} err="failed to get container status \"b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd\": rpc error: code = NotFound desc = 
could not find container \"b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd\": container with ID starting with b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd not found: ID does not exist" Mar 09 14:10:36 crc kubenswrapper[4764]: I0309 14:10:36.720252 4764 scope.go:117] "RemoveContainer" containerID="1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1" Mar 09 14:10:36 crc kubenswrapper[4764]: I0309 14:10:36.720434 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1"} err="failed to get container status \"1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1\": rpc error: code = NotFound desc = could not find container \"1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1\": container with ID starting with 1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1 not found: ID does not exist" Mar 09 14:10:36 crc kubenswrapper[4764]: I0309 14:10:36.720477 4764 scope.go:117] "RemoveContainer" containerID="c5d8efe5920ca730c02900ccbfc9cdc3905256778569090d306fce8d9fecf051" Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.018430 4764 scope.go:117] "RemoveContainer" containerID="fabc9d27abd14d14db86a85f4413de77c2fd101b8722a207f502ad9ee60b4405" Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.177339 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" event={"ID":"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff","Type":"ContainerDied","Data":"7623e23ae56b8d47625b88ad80b059396febc09e926f11449c6996f010201d29"} Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.177761 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7623e23ae56b8d47625b88ad80b059396febc09e926f11449c6996f010201d29" Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.185354 4764 scope.go:117] "RemoveContainer" 
containerID="5ee7ad3de1f9928c6b59a6676d88b6e84ba3c1fa18e74865387fac5e3b0fa60c" Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.227929 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.256891 4764 scope.go:117] "RemoveContainer" containerID="c44f22885081c03e82b6817f6378545ad23fd40c3fd48004324856781c5d89d8" Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.292513 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-797d44c9b-wrlx7" podUID="e00fc104-ec73-4190-a598-86de7ca6cfa5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.5:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:48566->10.217.1.5:8443: read: connection reset by peer" Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.293499 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j76rw\" (UniqueName: \"kubernetes.io/projected/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-kube-api-access-j76rw\") pod \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.293558 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-dns-svc\") pod \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.293714 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-openstack-edpm-ipam\") pod \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.293850 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-config\") pod \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.293890 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-ovsdbserver-sb\") pod \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.294577 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-ovsdbserver-nb\") pod \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.308088 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-kube-api-access-j76rw" (OuterVolumeSpecName: "kube-api-access-j76rw") pod "6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff" (UID: "6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff"). InnerVolumeSpecName "kube-api-access-j76rw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.311914 4764 scope.go:117] "RemoveContainer" containerID="cb0403e3d98f6b11f541bb3ed05951b3aa36eed0ed52f4863be171ce5eaf63e2" Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.377925 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff" (UID: "6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.398718 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.399077 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j76rw\" (UniqueName: \"kubernetes.io/projected/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-kube-api-access-j76rw\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.430390 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff" (UID: "6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.433240 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff" (UID: "6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.460372 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff" (UID: "6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.466699 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-config" (OuterVolumeSpecName: "config") pod "6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff" (UID: "6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.467288 4764 scope.go:117] "RemoveContainer" containerID="ff9bf0b1ebb34e175d58582b99ddb3f1097e979301d7cd95e80e6e75808835d6" Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.501828 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.501875 4764 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.501894 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.501905 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.577417 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:10:37 crc kubenswrapper[4764]: W0309 14:10:37.584290 4764 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcba66048_6589_4bda_99bc_f2b62d5a16cd.slice/crio-214f8758c7bcbceff8c2118cea823d3832d509fe2a476aefc1476edcb74a1b81 WatchSource:0}: Error finding container 214f8758c7bcbceff8c2118cea823d3832d509fe2a476aefc1476edcb74a1b81: Status 404 returned error can't find the container with id 214f8758c7bcbceff8c2118cea823d3832d509fe2a476aefc1476edcb74a1b81 Mar 09 14:10:38 crc kubenswrapper[4764]: I0309 14:10:38.197270 4764 generic.go:334] "Generic (PLEG): container finished" podID="e00fc104-ec73-4190-a598-86de7ca6cfa5" containerID="c791eb4afad3d3b7319a4ecbb9ad7314d04c8eb6a4954765b916e231a57d04c0" exitCode=0 Mar 09 14:10:38 crc kubenswrapper[4764]: I0309 14:10:38.197491 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-797d44c9b-wrlx7" event={"ID":"e00fc104-ec73-4190-a598-86de7ca6cfa5","Type":"ContainerDied","Data":"c791eb4afad3d3b7319a4ecbb9ad7314d04c8eb6a4954765b916e231a57d04c0"} Mar 09 14:10:38 crc kubenswrapper[4764]: I0309 14:10:38.209417 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cba66048-6589-4bda-99bc-f2b62d5a16cd","Type":"ContainerStarted","Data":"214f8758c7bcbceff8c2118cea823d3832d509fe2a476aefc1476edcb74a1b81"} Mar 09 14:10:38 crc kubenswrapper[4764]: I0309 14:10:38.216325 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 14:10:38 crc kubenswrapper[4764]: I0309 14:10:38.217361 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"604c4e24-15e4-43e1-b08c-74d8337a2e71","Type":"ContainerStarted","Data":"824266a7cc9cb5d7239bcaba8f3fa368c333aac554ec8e5c7b1cc64368838f39"} Mar 09 14:10:38 crc kubenswrapper[4764]: I0309 14:10:38.289466 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fbc59fbb7-688wh"] Mar 09 14:10:38 crc kubenswrapper[4764]: I0309 14:10:38.308084 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fbc59fbb7-688wh"] Mar 09 14:10:39 crc kubenswrapper[4764]: I0309 14:10:39.226900 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:10:39 crc kubenswrapper[4764]: I0309 14:10:39.232713 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cba66048-6589-4bda-99bc-f2b62d5a16cd","Type":"ContainerStarted","Data":"9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0"} Mar 09 14:10:39 crc kubenswrapper[4764]: I0309 14:10:39.232791 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cba66048-6589-4bda-99bc-f2b62d5a16cd","Type":"ContainerStarted","Data":"7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46"} Mar 09 14:10:39 crc kubenswrapper[4764]: I0309 14:10:39.241323 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"604c4e24-15e4-43e1-b08c-74d8337a2e71","Type":"ContainerStarted","Data":"03c13ad847d2a8fd02eb6ed9a1ace4039b7b02e90605b4014e12bc57ed3abe55"} Mar 09 14:10:39 crc kubenswrapper[4764]: I0309 14:10:39.275405 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=4.27057249 podStartE2EDuration="16.275383565s" 
podCreationTimestamp="2026-03-09 14:10:23 +0000 UTC" firstStartedPulling="2026-03-09 14:10:25.024464499 +0000 UTC m=+2980.274636407" lastFinishedPulling="2026-03-09 14:10:37.029275574 +0000 UTC m=+2992.279447482" observedRunningTime="2026-03-09 14:10:39.267007172 +0000 UTC m=+2994.517179080" watchObservedRunningTime="2026-03-09 14:10:39.275383565 +0000 UTC m=+2994.525555473" Mar 09 14:10:39 crc kubenswrapper[4764]: I0309 14:10:39.574118 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff" path="/var/lib/kubelet/pods/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff/volumes" Mar 09 14:10:40 crc kubenswrapper[4764]: I0309 14:10:40.255363 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cba66048-6589-4bda-99bc-f2b62d5a16cd","Type":"ContainerStarted","Data":"9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa"} Mar 09 14:10:43 crc kubenswrapper[4764]: I0309 14:10:43.287664 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cba66048-6589-4bda-99bc-f2b62d5a16cd","Type":"ContainerStarted","Data":"606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f"} Mar 09 14:10:43 crc kubenswrapper[4764]: I0309 14:10:43.287850 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerName="ceilometer-central-agent" containerID="cri-o://7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46" gracePeriod=30 Mar 09 14:10:43 crc kubenswrapper[4764]: I0309 14:10:43.287916 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerName="proxy-httpd" containerID="cri-o://606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f" gracePeriod=30 Mar 09 14:10:43 crc kubenswrapper[4764]: I0309 14:10:43.287930 4764 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerName="sg-core" containerID="cri-o://9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa" gracePeriod=30 Mar 09 14:10:43 crc kubenswrapper[4764]: I0309 14:10:43.287940 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerName="ceilometer-notification-agent" containerID="cri-o://9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0" gracePeriod=30 Mar 09 14:10:43 crc kubenswrapper[4764]: I0309 14:10:43.290868 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 14:10:43 crc kubenswrapper[4764]: I0309 14:10:43.320067 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=5.767577018 podStartE2EDuration="10.32003947s" podCreationTimestamp="2026-03-09 14:10:33 +0000 UTC" firstStartedPulling="2026-03-09 14:10:37.58978513 +0000 UTC m=+2992.839957028" lastFinishedPulling="2026-03-09 14:10:42.142247572 +0000 UTC m=+2997.392419480" observedRunningTime="2026-03-09 14:10:43.313558037 +0000 UTC m=+2998.563729945" watchObservedRunningTime="2026-03-09 14:10:43.32003947 +0000 UTC m=+2998.570211408" Mar 09 14:10:43 crc kubenswrapper[4764]: I0309 14:10:43.560757 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:10:43 crc kubenswrapper[4764]: E0309 14:10:43.561735 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.101317 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.179935 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.188078 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccg6p\" (UniqueName: \"kubernetes.io/projected/cba66048-6589-4bda-99bc-f2b62d5a16cd-kube-api-access-ccg6p\") pod \"cba66048-6589-4bda-99bc-f2b62d5a16cd\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.188196 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cba66048-6589-4bda-99bc-f2b62d5a16cd-run-httpd\") pod \"cba66048-6589-4bda-99bc-f2b62d5a16cd\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.188309 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-combined-ca-bundle\") pod \"cba66048-6589-4bda-99bc-f2b62d5a16cd\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.188369 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cba66048-6589-4bda-99bc-f2b62d5a16cd-log-httpd\") pod \"cba66048-6589-4bda-99bc-f2b62d5a16cd\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.188400 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-config-data\") pod \"cba66048-6589-4bda-99bc-f2b62d5a16cd\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.188541 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-ceilometer-tls-certs\") pod \"cba66048-6589-4bda-99bc-f2b62d5a16cd\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.188709 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-sg-core-conf-yaml\") pod \"cba66048-6589-4bda-99bc-f2b62d5a16cd\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.188773 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-scripts\") pod \"cba66048-6589-4bda-99bc-f2b62d5a16cd\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.189036 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cba66048-6589-4bda-99bc-f2b62d5a16cd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cba66048-6589-4bda-99bc-f2b62d5a16cd" (UID: "cba66048-6589-4bda-99bc-f2b62d5a16cd"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.189456 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cba66048-6589-4bda-99bc-f2b62d5a16cd-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.190214 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cba66048-6589-4bda-99bc-f2b62d5a16cd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cba66048-6589-4bda-99bc-f2b62d5a16cd" (UID: "cba66048-6589-4bda-99bc-f2b62d5a16cd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.195957 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cba66048-6589-4bda-99bc-f2b62d5a16cd-kube-api-access-ccg6p" (OuterVolumeSpecName: "kube-api-access-ccg6p") pod "cba66048-6589-4bda-99bc-f2b62d5a16cd" (UID: "cba66048-6589-4bda-99bc-f2b62d5a16cd"). InnerVolumeSpecName "kube-api-access-ccg6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.196011 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-scripts" (OuterVolumeSpecName: "scripts") pod "cba66048-6589-4bda-99bc-f2b62d5a16cd" (UID: "cba66048-6589-4bda-99bc-f2b62d5a16cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.231551 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cba66048-6589-4bda-99bc-f2b62d5a16cd" (UID: "cba66048-6589-4bda-99bc-f2b62d5a16cd"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.247830 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "cba66048-6589-4bda-99bc-f2b62d5a16cd" (UID: "cba66048-6589-4bda-99bc-f2b62d5a16cd"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.288898 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cba66048-6589-4bda-99bc-f2b62d5a16cd" (UID: "cba66048-6589-4bda-99bc-f2b62d5a16cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.292762 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.292797 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.292811 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccg6p\" (UniqueName: \"kubernetes.io/projected/cba66048-6589-4bda-99bc-f2b62d5a16cd-kube-api-access-ccg6p\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.292826 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.292838 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cba66048-6589-4bda-99bc-f2b62d5a16cd-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.292850 4764 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.301961 4764 generic.go:334] "Generic (PLEG): container finished" podID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerID="606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f" exitCode=0 Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.302001 4764 generic.go:334] "Generic (PLEG): container finished" podID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerID="9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa" exitCode=2 Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.302012 4764 generic.go:334] "Generic (PLEG): container finished" podID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerID="9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0" exitCode=0 Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.302022 4764 generic.go:334] "Generic (PLEG): container finished" podID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerID="7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46" exitCode=0 Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.302054 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.302051 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cba66048-6589-4bda-99bc-f2b62d5a16cd","Type":"ContainerDied","Data":"606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f"} Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.302123 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cba66048-6589-4bda-99bc-f2b62d5a16cd","Type":"ContainerDied","Data":"9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa"} Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.302138 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cba66048-6589-4bda-99bc-f2b62d5a16cd","Type":"ContainerDied","Data":"9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0"} Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.302177 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cba66048-6589-4bda-99bc-f2b62d5a16cd","Type":"ContainerDied","Data":"7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46"} Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.302186 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cba66048-6589-4bda-99bc-f2b62d5a16cd","Type":"ContainerDied","Data":"214f8758c7bcbceff8c2118cea823d3832d509fe2a476aefc1476edcb74a1b81"} Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.302184 4764 scope.go:117] "RemoveContainer" containerID="606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.328201 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-config-data" (OuterVolumeSpecName: "config-data") pod 
"cba66048-6589-4bda-99bc-f2b62d5a16cd" (UID: "cba66048-6589-4bda-99bc-f2b62d5a16cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.333328 4764 scope.go:117] "RemoveContainer" containerID="9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.355439 4764 scope.go:117] "RemoveContainer" containerID="9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.378860 4764 scope.go:117] "RemoveContainer" containerID="7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.395274 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.399833 4764 scope.go:117] "RemoveContainer" containerID="606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f" Mar 09 14:10:44 crc kubenswrapper[4764]: E0309 14:10:44.410499 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f\": container with ID starting with 606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f not found: ID does not exist" containerID="606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.410544 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f"} err="failed to get container status \"606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f\": rpc error: code = NotFound desc = could not find 
container \"606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f\": container with ID starting with 606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f not found: ID does not exist" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.410573 4764 scope.go:117] "RemoveContainer" containerID="9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa" Mar 09 14:10:44 crc kubenswrapper[4764]: E0309 14:10:44.411167 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa\": container with ID starting with 9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa not found: ID does not exist" containerID="9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.411188 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa"} err="failed to get container status \"9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa\": rpc error: code = NotFound desc = could not find container \"9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa\": container with ID starting with 9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa not found: ID does not exist" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.411202 4764 scope.go:117] "RemoveContainer" containerID="9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0" Mar 09 14:10:44 crc kubenswrapper[4764]: E0309 14:10:44.411465 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0\": container with ID starting with 9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0 not found: ID does 
not exist" containerID="9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.411485 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0"} err="failed to get container status \"9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0\": rpc error: code = NotFound desc = could not find container \"9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0\": container with ID starting with 9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0 not found: ID does not exist" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.411498 4764 scope.go:117] "RemoveContainer" containerID="7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46" Mar 09 14:10:44 crc kubenswrapper[4764]: E0309 14:10:44.411754 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46\": container with ID starting with 7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46 not found: ID does not exist" containerID="7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.411773 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46"} err="failed to get container status \"7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46\": rpc error: code = NotFound desc = could not find container \"7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46\": container with ID starting with 7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46 not found: ID does not exist" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.411785 4764 
scope.go:117] "RemoveContainer" containerID="606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.412122 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f"} err="failed to get container status \"606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f\": rpc error: code = NotFound desc = could not find container \"606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f\": container with ID starting with 606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f not found: ID does not exist" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.412141 4764 scope.go:117] "RemoveContainer" containerID="9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.412343 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa"} err="failed to get container status \"9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa\": rpc error: code = NotFound desc = could not find container \"9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa\": container with ID starting with 9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa not found: ID does not exist" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.412362 4764 scope.go:117] "RemoveContainer" containerID="9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.412667 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0"} err="failed to get container status \"9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0\": rpc 
error: code = NotFound desc = could not find container \"9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0\": container with ID starting with 9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0 not found: ID does not exist" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.412686 4764 scope.go:117] "RemoveContainer" containerID="7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.412896 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46"} err="failed to get container status \"7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46\": rpc error: code = NotFound desc = could not find container \"7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46\": container with ID starting with 7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46 not found: ID does not exist" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.412918 4764 scope.go:117] "RemoveContainer" containerID="606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.413102 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f"} err="failed to get container status \"606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f\": rpc error: code = NotFound desc = could not find container \"606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f\": container with ID starting with 606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f not found: ID does not exist" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.413121 4764 scope.go:117] "RemoveContainer" containerID="9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa" Mar 09 14:10:44 crc 
kubenswrapper[4764]: I0309 14:10:44.413374 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa"} err="failed to get container status \"9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa\": rpc error: code = NotFound desc = could not find container \"9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa\": container with ID starting with 9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa not found: ID does not exist" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.413394 4764 scope.go:117] "RemoveContainer" containerID="9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.413662 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0"} err="failed to get container status \"9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0\": rpc error: code = NotFound desc = could not find container \"9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0\": container with ID starting with 9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0 not found: ID does not exist" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.413680 4764 scope.go:117] "RemoveContainer" containerID="7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.413928 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46"} err="failed to get container status \"7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46\": rpc error: code = NotFound desc = could not find container \"7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46\": container 
with ID starting with 7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46 not found: ID does not exist" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.413947 4764 scope.go:117] "RemoveContainer" containerID="606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.414185 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f"} err="failed to get container status \"606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f\": rpc error: code = NotFound desc = could not find container \"606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f\": container with ID starting with 606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f not found: ID does not exist" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.414237 4764 scope.go:117] "RemoveContainer" containerID="9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.414505 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa"} err="failed to get container status \"9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa\": rpc error: code = NotFound desc = could not find container \"9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa\": container with ID starting with 9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa not found: ID does not exist" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.414525 4764 scope.go:117] "RemoveContainer" containerID="9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.414746 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0"} err="failed to get container status \"9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0\": rpc error: code = NotFound desc = could not find container \"9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0\": container with ID starting with 9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0 not found: ID does not exist" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.414770 4764 scope.go:117] "RemoveContainer" containerID="7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.415044 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46"} err="failed to get container status \"7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46\": rpc error: code = NotFound desc = could not find container \"7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46\": container with ID starting with 7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46 not found: ID does not exist" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.649358 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.663137 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.688277 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:10:44 crc kubenswrapper[4764]: E0309 14:10:44.689321 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerName="ceilometer-notification-agent" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.689422 4764 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerName="ceilometer-notification-agent" Mar 09 14:10:44 crc kubenswrapper[4764]: E0309 14:10:44.689528 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff" containerName="init" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.690141 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff" containerName="init" Mar 09 14:10:44 crc kubenswrapper[4764]: E0309 14:10:44.690263 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff" containerName="dnsmasq-dns" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.690330 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff" containerName="dnsmasq-dns" Mar 09 14:10:44 crc kubenswrapper[4764]: E0309 14:10:44.690456 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerName="ceilometer-central-agent" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.690525 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerName="ceilometer-central-agent" Mar 09 14:10:44 crc kubenswrapper[4764]: E0309 14:10:44.690608 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerName="sg-core" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.690701 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerName="sg-core" Mar 09 14:10:44 crc kubenswrapper[4764]: E0309 14:10:44.690790 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerName="proxy-httpd" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.690955 4764 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerName="proxy-httpd" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.691417 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerName="ceilometer-central-agent" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.691504 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff" containerName="dnsmasq-dns" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.691574 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerName="sg-core" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.691672 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerName="ceilometer-notification-agent" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.691754 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerName="proxy-httpd" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.694858 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.707668 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.710410 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.710801 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.710910 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.816482 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/182886b1-5569-456a-aa1e-129021e95bfe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.816949 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/182886b1-5569-456a-aa1e-129021e95bfe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.817036 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/182886b1-5569-456a-aa1e-129021e95bfe-scripts\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.817060 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/182886b1-5569-456a-aa1e-129021e95bfe-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.817097 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/182886b1-5569-456a-aa1e-129021e95bfe-config-data\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.817128 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n74nk\" (UniqueName: \"kubernetes.io/projected/182886b1-5569-456a-aa1e-129021e95bfe-kube-api-access-n74nk\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.817199 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/182886b1-5569-456a-aa1e-129021e95bfe-run-httpd\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.817224 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/182886b1-5569-456a-aa1e-129021e95bfe-log-httpd\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.919911 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/182886b1-5569-456a-aa1e-129021e95bfe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.920014 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/182886b1-5569-456a-aa1e-129021e95bfe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.920216 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/182886b1-5569-456a-aa1e-129021e95bfe-scripts\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.920251 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/182886b1-5569-456a-aa1e-129021e95bfe-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.920315 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/182886b1-5569-456a-aa1e-129021e95bfe-config-data\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.920373 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n74nk\" (UniqueName: \"kubernetes.io/projected/182886b1-5569-456a-aa1e-129021e95bfe-kube-api-access-n74nk\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.920467 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/182886b1-5569-456a-aa1e-129021e95bfe-run-httpd\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.920495 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/182886b1-5569-456a-aa1e-129021e95bfe-log-httpd\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.921663 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/182886b1-5569-456a-aa1e-129021e95bfe-log-httpd\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.922775 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/182886b1-5569-456a-aa1e-129021e95bfe-run-httpd\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.929160 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/182886b1-5569-456a-aa1e-129021e95bfe-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.931890 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/182886b1-5569-456a-aa1e-129021e95bfe-scripts\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.932441 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/182886b1-5569-456a-aa1e-129021e95bfe-config-data\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.932524 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/182886b1-5569-456a-aa1e-129021e95bfe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.932836 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/182886b1-5569-456a-aa1e-129021e95bfe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.951210 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n74nk\" (UniqueName: \"kubernetes.io/projected/182886b1-5569-456a-aa1e-129021e95bfe-kube-api-access-n74nk\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:45 crc kubenswrapper[4764]: I0309 14:10:45.037595 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:10:45 crc kubenswrapper[4764]: I0309 14:10:45.525787 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:10:45 crc kubenswrapper[4764]: I0309 14:10:45.598911 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" path="/var/lib/kubelet/pods/cba66048-6589-4bda-99bc-f2b62d5a16cd/volumes" Mar 09 14:10:46 crc kubenswrapper[4764]: I0309 14:10:46.085943 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Mar 09 14:10:46 crc kubenswrapper[4764]: I0309 14:10:46.135813 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Mar 09 14:10:46 crc kubenswrapper[4764]: I0309 14:10:46.334569 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"182886b1-5569-456a-aa1e-129021e95bfe","Type":"ContainerStarted","Data":"ede734f8333a2fa733a5fce350a7ca032ce48234ca605f47c897a4fb42ea2a53"} Mar 09 14:10:46 crc kubenswrapper[4764]: I0309 14:10:46.335091 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="30f42bb2-6e26-4cbb-942b-a7d4ede4f128" containerName="manila-scheduler" containerID="cri-o://b743badbf09f2d2b0dc2fef97dfa9da34473a9bbc60350880026711252d7f4f7" gracePeriod=30 Mar 09 14:10:46 crc kubenswrapper[4764]: I0309 14:10:46.335164 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="30f42bb2-6e26-4cbb-942b-a7d4ede4f128" containerName="probe" containerID="cri-o://cbc2d80e7e756e93ed514bd09529210c2d50987e0f38ca0f0f28ae42fe9a1fc9" gracePeriod=30 Mar 09 14:10:47 crc kubenswrapper[4764]: I0309 14:10:47.273598 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-797d44c9b-wrlx7" podUID="e00fc104-ec73-4190-a598-86de7ca6cfa5" 
containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.5:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.5:8443: connect: connection refused" Mar 09 14:10:47 crc kubenswrapper[4764]: I0309 14:10:47.346335 4764 generic.go:334] "Generic (PLEG): container finished" podID="30f42bb2-6e26-4cbb-942b-a7d4ede4f128" containerID="cbc2d80e7e756e93ed514bd09529210c2d50987e0f38ca0f0f28ae42fe9a1fc9" exitCode=0 Mar 09 14:10:47 crc kubenswrapper[4764]: I0309 14:10:47.346458 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"30f42bb2-6e26-4cbb-942b-a7d4ede4f128","Type":"ContainerDied","Data":"cbc2d80e7e756e93ed514bd09529210c2d50987e0f38ca0f0f28ae42fe9a1fc9"} Mar 09 14:10:47 crc kubenswrapper[4764]: I0309 14:10:47.348238 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"182886b1-5569-456a-aa1e-129021e95bfe","Type":"ContainerStarted","Data":"c221b73ef18ea3e73a5026b139c9df4d59925e3fefafdd30786207edc3ef50b0"} Mar 09 14:10:48 crc kubenswrapper[4764]: I0309 14:10:48.359670 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"182886b1-5569-456a-aa1e-129021e95bfe","Type":"ContainerStarted","Data":"4673bf510635fd12a6bc053c4b946a3d5cf647ce3ddfb9f8f5fa850ae40a11dc"} Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.150565 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.230183 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-combined-ca-bundle\") pod \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.230348 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-etc-machine-id\") pod \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.230371 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-scripts\") pod \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.230403 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-config-data\") pod \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.230524 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq2r9\" (UniqueName: \"kubernetes.io/projected/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-kube-api-access-lq2r9\") pod \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.230575 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-config-data-custom\") pod \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.231291 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "30f42bb2-6e26-4cbb-942b-a7d4ede4f128" (UID: "30f42bb2-6e26-4cbb-942b-a7d4ede4f128"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.240880 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "30f42bb2-6e26-4cbb-942b-a7d4ede4f128" (UID: "30f42bb2-6e26-4cbb-942b-a7d4ede4f128"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.241984 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-kube-api-access-lq2r9" (OuterVolumeSpecName: "kube-api-access-lq2r9") pod "30f42bb2-6e26-4cbb-942b-a7d4ede4f128" (UID: "30f42bb2-6e26-4cbb-942b-a7d4ede4f128"). InnerVolumeSpecName "kube-api-access-lq2r9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.252850 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-scripts" (OuterVolumeSpecName: "scripts") pod "30f42bb2-6e26-4cbb-942b-a7d4ede4f128" (UID: "30f42bb2-6e26-4cbb-942b-a7d4ede4f128"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.334595 4764 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.334661 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.334676 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq2r9\" (UniqueName: \"kubernetes.io/projected/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-kube-api-access-lq2r9\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.334688 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.345899 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30f42bb2-6e26-4cbb-942b-a7d4ede4f128" (UID: "30f42bb2-6e26-4cbb-942b-a7d4ede4f128"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.373184 4764 generic.go:334] "Generic (PLEG): container finished" podID="30f42bb2-6e26-4cbb-942b-a7d4ede4f128" containerID="b743badbf09f2d2b0dc2fef97dfa9da34473a9bbc60350880026711252d7f4f7" exitCode=0 Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.373280 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"30f42bb2-6e26-4cbb-942b-a7d4ede4f128","Type":"ContainerDied","Data":"b743badbf09f2d2b0dc2fef97dfa9da34473a9bbc60350880026711252d7f4f7"} Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.373299 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.373322 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"30f42bb2-6e26-4cbb-942b-a7d4ede4f128","Type":"ContainerDied","Data":"69e9c4821e7bd2b8ea5bd5a2a4ce9a1fe30fca8c9b901cad13802f0443f55e38"} Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.373348 4764 scope.go:117] "RemoveContainer" containerID="cbc2d80e7e756e93ed514bd09529210c2d50987e0f38ca0f0f28ae42fe9a1fc9" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.379053 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"182886b1-5569-456a-aa1e-129021e95bfe","Type":"ContainerStarted","Data":"f7a1e9b9529e4d8555cc65fbd8d4b0c25e0e18ccde1cec73861b294fca116b0e"} Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.410019 4764 scope.go:117] "RemoveContainer" containerID="b743badbf09f2d2b0dc2fef97dfa9da34473a9bbc60350880026711252d7f4f7" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.436682 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-combined-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.441576 4764 scope.go:117] "RemoveContainer" containerID="cbc2d80e7e756e93ed514bd09529210c2d50987e0f38ca0f0f28ae42fe9a1fc9" Mar 09 14:10:49 crc kubenswrapper[4764]: E0309 14:10:49.442277 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbc2d80e7e756e93ed514bd09529210c2d50987e0f38ca0f0f28ae42fe9a1fc9\": container with ID starting with cbc2d80e7e756e93ed514bd09529210c2d50987e0f38ca0f0f28ae42fe9a1fc9 not found: ID does not exist" containerID="cbc2d80e7e756e93ed514bd09529210c2d50987e0f38ca0f0f28ae42fe9a1fc9" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.442323 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbc2d80e7e756e93ed514bd09529210c2d50987e0f38ca0f0f28ae42fe9a1fc9"} err="failed to get container status \"cbc2d80e7e756e93ed514bd09529210c2d50987e0f38ca0f0f28ae42fe9a1fc9\": rpc error: code = NotFound desc = could not find container \"cbc2d80e7e756e93ed514bd09529210c2d50987e0f38ca0f0f28ae42fe9a1fc9\": container with ID starting with cbc2d80e7e756e93ed514bd09529210c2d50987e0f38ca0f0f28ae42fe9a1fc9 not found: ID does not exist" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.442350 4764 scope.go:117] "RemoveContainer" containerID="b743badbf09f2d2b0dc2fef97dfa9da34473a9bbc60350880026711252d7f4f7" Mar 09 14:10:49 crc kubenswrapper[4764]: E0309 14:10:49.442934 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b743badbf09f2d2b0dc2fef97dfa9da34473a9bbc60350880026711252d7f4f7\": container with ID starting with b743badbf09f2d2b0dc2fef97dfa9da34473a9bbc60350880026711252d7f4f7 not found: ID does not exist" containerID="b743badbf09f2d2b0dc2fef97dfa9da34473a9bbc60350880026711252d7f4f7" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.443064 4764 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b743badbf09f2d2b0dc2fef97dfa9da34473a9bbc60350880026711252d7f4f7"} err="failed to get container status \"b743badbf09f2d2b0dc2fef97dfa9da34473a9bbc60350880026711252d7f4f7\": rpc error: code = NotFound desc = could not find container \"b743badbf09f2d2b0dc2fef97dfa9da34473a9bbc60350880026711252d7f4f7\": container with ID starting with b743badbf09f2d2b0dc2fef97dfa9da34473a9bbc60350880026711252d7f4f7 not found: ID does not exist" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.445905 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-config-data" (OuterVolumeSpecName: "config-data") pod "30f42bb2-6e26-4cbb-942b-a7d4ede4f128" (UID: "30f42bb2-6e26-4cbb-942b-a7d4ede4f128"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.539174 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.732531 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.744783 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"] Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.755641 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Mar 09 14:10:49 crc kubenswrapper[4764]: E0309 14:10:49.756134 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30f42bb2-6e26-4cbb-942b-a7d4ede4f128" containerName="probe" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.756154 4764 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="30f42bb2-6e26-4cbb-942b-a7d4ede4f128" containerName="probe" Mar 09 14:10:49 crc kubenswrapper[4764]: E0309 14:10:49.756169 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30f42bb2-6e26-4cbb-942b-a7d4ede4f128" containerName="manila-scheduler" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.756176 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="30f42bb2-6e26-4cbb-942b-a7d4ede4f128" containerName="manila-scheduler" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.756369 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="30f42bb2-6e26-4cbb-942b-a7d4ede4f128" containerName="manila-scheduler" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.756384 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="30f42bb2-6e26-4cbb-942b-a7d4ede4f128" containerName="probe" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.757515 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.761466 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.772123 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.847334 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/492d78a8-09ea-4239-a53f-b8d0480fcf36-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.847461 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/492d78a8-09ea-4239-a53f-b8d0480fcf36-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.847687 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/492d78a8-09ea-4239-a53f-b8d0480fcf36-scripts\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.847757 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/492d78a8-09ea-4239-a53f-b8d0480fcf36-config-data\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.847957 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsflw\" (UniqueName: \"kubernetes.io/projected/492d78a8-09ea-4239-a53f-b8d0480fcf36-kube-api-access-bsflw\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.848120 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/492d78a8-09ea-4239-a53f-b8d0480fcf36-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.950762 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsflw\" (UniqueName: 
\"kubernetes.io/projected/492d78a8-09ea-4239-a53f-b8d0480fcf36-kube-api-access-bsflw\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.950951 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/492d78a8-09ea-4239-a53f-b8d0480fcf36-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.951022 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/492d78a8-09ea-4239-a53f-b8d0480fcf36-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.951095 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/492d78a8-09ea-4239-a53f-b8d0480fcf36-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.951153 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/492d78a8-09ea-4239-a53f-b8d0480fcf36-scripts\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.951186 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/492d78a8-09ea-4239-a53f-b8d0480fcf36-config-data\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " 
pod="openstack/manila-scheduler-0" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.951961 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/492d78a8-09ea-4239-a53f-b8d0480fcf36-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.956399 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/492d78a8-09ea-4239-a53f-b8d0480fcf36-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.956430 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/492d78a8-09ea-4239-a53f-b8d0480fcf36-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.957541 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/492d78a8-09ea-4239-a53f-b8d0480fcf36-config-data\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.961942 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/492d78a8-09ea-4239-a53f-b8d0480fcf36-scripts\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.969178 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsflw\" (UniqueName: 
\"kubernetes.io/projected/492d78a8-09ea-4239-a53f-b8d0480fcf36-kube-api-access-bsflw\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:50 crc kubenswrapper[4764]: I0309 14:10:50.136102 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 09 14:10:50 crc kubenswrapper[4764]: I0309 14:10:50.473620 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 09 14:10:50 crc kubenswrapper[4764]: W0309 14:10:50.481012 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod492d78a8_09ea_4239_a53f_b8d0480fcf36.slice/crio-df86a97d5a10593d6e7f1198c9631d358e374b4e5920b01ae0d48bd2b50454bb WatchSource:0}: Error finding container df86a97d5a10593d6e7f1198c9631d358e374b4e5920b01ae0d48bd2b50454bb: Status 404 returned error can't find the container with id df86a97d5a10593d6e7f1198c9631d358e374b4e5920b01ae0d48bd2b50454bb Mar 09 14:10:51 crc kubenswrapper[4764]: I0309 14:10:51.408722 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"492d78a8-09ea-4239-a53f-b8d0480fcf36","Type":"ContainerStarted","Data":"bb4283f38719f6811bc261c4481786fa88f25ac364ff8b2ad91f4031e6d2e769"} Mar 09 14:10:51 crc kubenswrapper[4764]: I0309 14:10:51.409560 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"492d78a8-09ea-4239-a53f-b8d0480fcf36","Type":"ContainerStarted","Data":"df86a97d5a10593d6e7f1198c9631d358e374b4e5920b01ae0d48bd2b50454bb"} Mar 09 14:10:51 crc kubenswrapper[4764]: I0309 14:10:51.436457 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"182886b1-5569-456a-aa1e-129021e95bfe","Type":"ContainerStarted","Data":"962128e6d28446f61932e6679464215a3def7afa777dff5d7a16332a4480165a"} Mar 09 14:10:51 crc 
kubenswrapper[4764]: I0309 14:10:51.438182 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 14:10:51 crc kubenswrapper[4764]: I0309 14:10:51.474030 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.7179550260000003 podStartE2EDuration="7.473998004s" podCreationTimestamp="2026-03-09 14:10:44 +0000 UTC" firstStartedPulling="2026-03-09 14:10:45.577503275 +0000 UTC m=+3000.827675183" lastFinishedPulling="2026-03-09 14:10:50.333546253 +0000 UTC m=+3005.583718161" observedRunningTime="2026-03-09 14:10:51.46035396 +0000 UTC m=+3006.710525878" watchObservedRunningTime="2026-03-09 14:10:51.473998004 +0000 UTC m=+3006.724169922" Mar 09 14:10:51 crc kubenswrapper[4764]: I0309 14:10:51.573752 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30f42bb2-6e26-4cbb-942b-a7d4ede4f128" path="/var/lib/kubelet/pods/30f42bb2-6e26-4cbb-942b-a7d4ede4f128/volumes" Mar 09 14:10:51 crc kubenswrapper[4764]: I0309 14:10:51.795929 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Mar 09 14:10:52 crc kubenswrapper[4764]: I0309 14:10:52.450790 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"492d78a8-09ea-4239-a53f-b8d0480fcf36","Type":"ContainerStarted","Data":"79b4c0474938cd83712d03dc98824c4915b148a402a45ba43f0f180e88656641"} Mar 09 14:10:52 crc kubenswrapper[4764]: I0309 14:10:52.493711 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.493681621 podStartE2EDuration="3.493681621s" podCreationTimestamp="2026-03-09 14:10:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:10:52.478797703 +0000 UTC m=+3007.728969631" watchObservedRunningTime="2026-03-09 
14:10:52.493681621 +0000 UTC m=+3007.743853559" Mar 09 14:10:55 crc kubenswrapper[4764]: I0309 14:10:55.928246 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Mar 09 14:10:56 crc kubenswrapper[4764]: I0309 14:10:56.018765 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Mar 09 14:10:56 crc kubenswrapper[4764]: I0309 14:10:56.490755 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="604c4e24-15e4-43e1-b08c-74d8337a2e71" containerName="manila-share" containerID="cri-o://824266a7cc9cb5d7239bcaba8f3fa368c333aac554ec8e5c7b1cc64368838f39" gracePeriod=30 Mar 09 14:10:56 crc kubenswrapper[4764]: I0309 14:10:56.490807 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="604c4e24-15e4-43e1-b08c-74d8337a2e71" containerName="probe" containerID="cri-o://03c13ad847d2a8fd02eb6ed9a1ace4039b7b02e90605b4014e12bc57ed3abe55" gracePeriod=30 Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.274126 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-797d44c9b-wrlx7" podUID="e00fc104-ec73-4190-a598-86de7ca6cfa5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.5:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.5:8443: connect: connection refused" Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.275057 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.525425 4764 generic.go:334] "Generic (PLEG): container finished" podID="604c4e24-15e4-43e1-b08c-74d8337a2e71" containerID="03c13ad847d2a8fd02eb6ed9a1ace4039b7b02e90605b4014e12bc57ed3abe55" exitCode=0 Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.525466 4764 generic.go:334] "Generic (PLEG): 
container finished" podID="604c4e24-15e4-43e1-b08c-74d8337a2e71" containerID="824266a7cc9cb5d7239bcaba8f3fa368c333aac554ec8e5c7b1cc64368838f39" exitCode=1 Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.525489 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"604c4e24-15e4-43e1-b08c-74d8337a2e71","Type":"ContainerDied","Data":"03c13ad847d2a8fd02eb6ed9a1ace4039b7b02e90605b4014e12bc57ed3abe55"} Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.525517 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"604c4e24-15e4-43e1-b08c-74d8337a2e71","Type":"ContainerDied","Data":"824266a7cc9cb5d7239bcaba8f3fa368c333aac554ec8e5c7b1cc64368838f39"} Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.560975 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:10:57 crc kubenswrapper[4764]: E0309 14:10:57.561351 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.627005 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.752382 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-config-data\") pod \"604c4e24-15e4-43e1-b08c-74d8337a2e71\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.752469 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-scripts\") pod \"604c4e24-15e4-43e1-b08c-74d8337a2e71\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.752493 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxzkm\" (UniqueName: \"kubernetes.io/projected/604c4e24-15e4-43e1-b08c-74d8337a2e71-kube-api-access-wxzkm\") pod \"604c4e24-15e4-43e1-b08c-74d8337a2e71\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.752593 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/604c4e24-15e4-43e1-b08c-74d8337a2e71-etc-machine-id\") pod \"604c4e24-15e4-43e1-b08c-74d8337a2e71\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.752693 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/604c4e24-15e4-43e1-b08c-74d8337a2e71-ceph\") pod \"604c4e24-15e4-43e1-b08c-74d8337a2e71\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.752860 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-combined-ca-bundle\") pod \"604c4e24-15e4-43e1-b08c-74d8337a2e71\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.752936 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-config-data-custom\") pod \"604c4e24-15e4-43e1-b08c-74d8337a2e71\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.752999 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/604c4e24-15e4-43e1-b08c-74d8337a2e71-var-lib-manila\") pod \"604c4e24-15e4-43e1-b08c-74d8337a2e71\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.753784 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/604c4e24-15e4-43e1-b08c-74d8337a2e71-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "604c4e24-15e4-43e1-b08c-74d8337a2e71" (UID: "604c4e24-15e4-43e1-b08c-74d8337a2e71"). InnerVolumeSpecName "var-lib-manila". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.754381 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/604c4e24-15e4-43e1-b08c-74d8337a2e71-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "604c4e24-15e4-43e1-b08c-74d8337a2e71" (UID: "604c4e24-15e4-43e1-b08c-74d8337a2e71"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.760832 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/604c4e24-15e4-43e1-b08c-74d8337a2e71-kube-api-access-wxzkm" (OuterVolumeSpecName: "kube-api-access-wxzkm") pod "604c4e24-15e4-43e1-b08c-74d8337a2e71" (UID: "604c4e24-15e4-43e1-b08c-74d8337a2e71"). InnerVolumeSpecName "kube-api-access-wxzkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.760908 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/604c4e24-15e4-43e1-b08c-74d8337a2e71-ceph" (OuterVolumeSpecName: "ceph") pod "604c4e24-15e4-43e1-b08c-74d8337a2e71" (UID: "604c4e24-15e4-43e1-b08c-74d8337a2e71"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.761421 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-scripts" (OuterVolumeSpecName: "scripts") pod "604c4e24-15e4-43e1-b08c-74d8337a2e71" (UID: "604c4e24-15e4-43e1-b08c-74d8337a2e71"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.761531 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "604c4e24-15e4-43e1-b08c-74d8337a2e71" (UID: "604c4e24-15e4-43e1-b08c-74d8337a2e71"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.816524 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "604c4e24-15e4-43e1-b08c-74d8337a2e71" (UID: "604c4e24-15e4-43e1-b08c-74d8337a2e71"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.855001 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-config-data" (OuterVolumeSpecName: "config-data") pod "604c4e24-15e4-43e1-b08c-74d8337a2e71" (UID: "604c4e24-15e4-43e1-b08c-74d8337a2e71"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.855676 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-config-data\") pod \"604c4e24-15e4-43e1-b08c-74d8337a2e71\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " Mar 09 14:10:57 crc kubenswrapper[4764]: W0309 14:10:57.855812 4764 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/604c4e24-15e4-43e1-b08c-74d8337a2e71/volumes/kubernetes.io~secret/config-data Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.855831 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-config-data" (OuterVolumeSpecName: "config-data") pod "604c4e24-15e4-43e1-b08c-74d8337a2e71" (UID: "604c4e24-15e4-43e1-b08c-74d8337a2e71"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.858176 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.858225 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.858240 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxzkm\" (UniqueName: \"kubernetes.io/projected/604c4e24-15e4-43e1-b08c-74d8337a2e71-kube-api-access-wxzkm\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.858254 4764 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/604c4e24-15e4-43e1-b08c-74d8337a2e71-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.858264 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/604c4e24-15e4-43e1-b08c-74d8337a2e71-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.858274 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.858284 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.858294 4764 
reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/604c4e24-15e4-43e1-b08c-74d8337a2e71-var-lib-manila\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.542380 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"604c4e24-15e4-43e1-b08c-74d8337a2e71","Type":"ContainerDied","Data":"99bc2f9e9539dc58eaf52226466b27a3c1c886f9a53c573b32136461c86d2a63"} Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.543769 4764 scope.go:117] "RemoveContainer" containerID="03c13ad847d2a8fd02eb6ed9a1ace4039b7b02e90605b4014e12bc57ed3abe55" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.542506 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.584083 4764 scope.go:117] "RemoveContainer" containerID="824266a7cc9cb5d7239bcaba8f3fa368c333aac554ec8e5c7b1cc64368838f39" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.591298 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.600328 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.623896 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Mar 09 14:10:58 crc kubenswrapper[4764]: E0309 14:10:58.624370 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="604c4e24-15e4-43e1-b08c-74d8337a2e71" containerName="probe" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.624386 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="604c4e24-15e4-43e1-b08c-74d8337a2e71" containerName="probe" Mar 09 14:10:58 crc kubenswrapper[4764]: E0309 14:10:58.624420 4764 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="604c4e24-15e4-43e1-b08c-74d8337a2e71" containerName="manila-share" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.624426 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="604c4e24-15e4-43e1-b08c-74d8337a2e71" containerName="manila-share" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.624618 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="604c4e24-15e4-43e1-b08c-74d8337a2e71" containerName="probe" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.624665 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="604c4e24-15e4-43e1-b08c-74d8337a2e71" containerName="manila-share" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.626352 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.632941 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.656231 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.679375 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/2caafd00-b539-4f40-b1c6-af6957bcb458-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.679446 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2caafd00-b539-4f40-b1c6-af6957bcb458-config-data\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc 
kubenswrapper[4764]: I0309 14:10:58.679474 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2caafd00-b539-4f40-b1c6-af6957bcb458-scripts\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.679559 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crvf4\" (UniqueName: \"kubernetes.io/projected/2caafd00-b539-4f40-b1c6-af6957bcb458-kube-api-access-crvf4\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.679593 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2caafd00-b539-4f40-b1c6-af6957bcb458-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.679627 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2caafd00-b539-4f40-b1c6-af6957bcb458-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.679689 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2caafd00-b539-4f40-b1c6-af6957bcb458-ceph\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.679723 
4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2caafd00-b539-4f40-b1c6-af6957bcb458-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.782166 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2caafd00-b539-4f40-b1c6-af6957bcb458-config-data\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.782259 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2caafd00-b539-4f40-b1c6-af6957bcb458-scripts\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.782361 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crvf4\" (UniqueName: \"kubernetes.io/projected/2caafd00-b539-4f40-b1c6-af6957bcb458-kube-api-access-crvf4\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.782409 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2caafd00-b539-4f40-b1c6-af6957bcb458-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.782453 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/2caafd00-b539-4f40-b1c6-af6957bcb458-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.783416 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2caafd00-b539-4f40-b1c6-af6957bcb458-ceph\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.783459 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2caafd00-b539-4f40-b1c6-af6957bcb458-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.783578 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/2caafd00-b539-4f40-b1c6-af6957bcb458-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.783790 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/2caafd00-b539-4f40-b1c6-af6957bcb458-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.783887 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2caafd00-b539-4f40-b1c6-af6957bcb458-etc-machine-id\") pod \"manila-share-share1-0\" (UID: 
\"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.787338 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2caafd00-b539-4f40-b1c6-af6957bcb458-scripts\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.787455 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2caafd00-b539-4f40-b1c6-af6957bcb458-ceph\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.789028 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2caafd00-b539-4f40-b1c6-af6957bcb458-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.796614 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2caafd00-b539-4f40-b1c6-af6957bcb458-config-data\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.805545 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2caafd00-b539-4f40-b1c6-af6957bcb458-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.805848 4764 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-crvf4\" (UniqueName: \"kubernetes.io/projected/2caafd00-b539-4f40-b1c6-af6957bcb458-kube-api-access-crvf4\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.950835 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 09 14:10:59 crc kubenswrapper[4764]: I0309 14:10:59.526265 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 09 14:10:59 crc kubenswrapper[4764]: I0309 14:10:59.601585 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="604c4e24-15e4-43e1-b08c-74d8337a2e71" path="/var/lib/kubelet/pods/604c4e24-15e4-43e1-b08c-74d8337a2e71/volumes" Mar 09 14:10:59 crc kubenswrapper[4764]: I0309 14:10:59.604221 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"2caafd00-b539-4f40-b1c6-af6957bcb458","Type":"ContainerStarted","Data":"f7fd9e8bd13abf9c5c2923dfa95c2915e4ea26dfb1c7b93494c3dd27e85dab7e"} Mar 09 14:11:00 crc kubenswrapper[4764]: I0309 14:11:00.137180 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Mar 09 14:11:00 crc kubenswrapper[4764]: I0309 14:11:00.610390 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"2caafd00-b539-4f40-b1c6-af6957bcb458","Type":"ContainerStarted","Data":"6c988ca090ac7db06d24d10df0cb1762a1a1c8f2fe2b55bd53a6bf46db54f750"} Mar 09 14:11:01 crc kubenswrapper[4764]: I0309 14:11:01.624903 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"2caafd00-b539-4f40-b1c6-af6957bcb458","Type":"ContainerStarted","Data":"2cfe387aadda7534e64755323f99bba27961b51d18e37a4c5e9977c9fdc6d4ee"} Mar 09 14:11:01 crc kubenswrapper[4764]: I0309 
14:11:01.653545 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.65345204 podStartE2EDuration="3.65345204s" podCreationTimestamp="2026-03-09 14:10:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:11:01.652360871 +0000 UTC m=+3016.902532779" watchObservedRunningTime="2026-03-09 14:11:01.65345204 +0000 UTC m=+3016.903623948" Mar 09 14:11:01 crc kubenswrapper[4764]: I0309 14:11:01.839921 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.577811 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.657515 4764 generic.go:334] "Generic (PLEG): container finished" podID="e00fc104-ec73-4190-a598-86de7ca6cfa5" containerID="ed7bd666dbaa56d7a77980118a8739a44cafc9a966ea8469e78e592a4a49ac96" exitCode=137 Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.657577 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-797d44c9b-wrlx7" event={"ID":"e00fc104-ec73-4190-a598-86de7ca6cfa5","Type":"ContainerDied","Data":"ed7bd666dbaa56d7a77980118a8739a44cafc9a966ea8469e78e592a4a49ac96"} Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.657617 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-797d44c9b-wrlx7" event={"ID":"e00fc104-ec73-4190-a598-86de7ca6cfa5","Type":"ContainerDied","Data":"183964e4224d34ffefb68047495776b87e734441e831507181f1e3aafc498e51"} Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.657686 4764 scope.go:117] "RemoveContainer" containerID="c791eb4afad3d3b7319a4ecbb9ad7314d04c8eb6a4954765b916e231a57d04c0" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.657887 4764 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.684714 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l5bd\" (UniqueName: \"kubernetes.io/projected/e00fc104-ec73-4190-a598-86de7ca6cfa5-kube-api-access-9l5bd\") pod \"e00fc104-ec73-4190-a598-86de7ca6cfa5\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.684781 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e00fc104-ec73-4190-a598-86de7ca6cfa5-config-data\") pod \"e00fc104-ec73-4190-a598-86de7ca6cfa5\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.684847 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-horizon-secret-key\") pod \"e00fc104-ec73-4190-a598-86de7ca6cfa5\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.684953 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e00fc104-ec73-4190-a598-86de7ca6cfa5-logs\") pod \"e00fc104-ec73-4190-a598-86de7ca6cfa5\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.685111 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e00fc104-ec73-4190-a598-86de7ca6cfa5-scripts\") pod \"e00fc104-ec73-4190-a598-86de7ca6cfa5\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.685132 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-horizon-tls-certs\") pod \"e00fc104-ec73-4190-a598-86de7ca6cfa5\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.685172 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-combined-ca-bundle\") pod \"e00fc104-ec73-4190-a598-86de7ca6cfa5\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.687189 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e00fc104-ec73-4190-a598-86de7ca6cfa5-logs" (OuterVolumeSpecName: "logs") pod "e00fc104-ec73-4190-a598-86de7ca6cfa5" (UID: "e00fc104-ec73-4190-a598-86de7ca6cfa5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.692315 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e00fc104-ec73-4190-a598-86de7ca6cfa5-kube-api-access-9l5bd" (OuterVolumeSpecName: "kube-api-access-9l5bd") pod "e00fc104-ec73-4190-a598-86de7ca6cfa5" (UID: "e00fc104-ec73-4190-a598-86de7ca6cfa5"). InnerVolumeSpecName "kube-api-access-9l5bd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.694690 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e00fc104-ec73-4190-a598-86de7ca6cfa5" (UID: "e00fc104-ec73-4190-a598-86de7ca6cfa5"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.714302 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e00fc104-ec73-4190-a598-86de7ca6cfa5-scripts" (OuterVolumeSpecName: "scripts") pod "e00fc104-ec73-4190-a598-86de7ca6cfa5" (UID: "e00fc104-ec73-4190-a598-86de7ca6cfa5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.715209 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e00fc104-ec73-4190-a598-86de7ca6cfa5-config-data" (OuterVolumeSpecName: "config-data") pod "e00fc104-ec73-4190-a598-86de7ca6cfa5" (UID: "e00fc104-ec73-4190-a598-86de7ca6cfa5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.719932 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e00fc104-ec73-4190-a598-86de7ca6cfa5" (UID: "e00fc104-ec73-4190-a598-86de7ca6cfa5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.744218 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "e00fc104-ec73-4190-a598-86de7ca6cfa5" (UID: "e00fc104-ec73-4190-a598-86de7ca6cfa5"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.795556 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9l5bd\" (UniqueName: \"kubernetes.io/projected/e00fc104-ec73-4190-a598-86de7ca6cfa5-kube-api-access-9l5bd\") on node \"crc\" DevicePath \"\"" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.795615 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e00fc104-ec73-4190-a598-86de7ca6cfa5-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.795633 4764 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.795669 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e00fc104-ec73-4190-a598-86de7ca6cfa5-logs\") on node \"crc\" DevicePath \"\"" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.795685 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e00fc104-ec73-4190-a598-86de7ca6cfa5-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.795695 4764 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.795713 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.822001 4764 scope.go:117] 
"RemoveContainer" containerID="ed7bd666dbaa56d7a77980118a8739a44cafc9a966ea8469e78e592a4a49ac96" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.849428 4764 scope.go:117] "RemoveContainer" containerID="c791eb4afad3d3b7319a4ecbb9ad7314d04c8eb6a4954765b916e231a57d04c0" Mar 09 14:11:04 crc kubenswrapper[4764]: E0309 14:11:04.849888 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c791eb4afad3d3b7319a4ecbb9ad7314d04c8eb6a4954765b916e231a57d04c0\": container with ID starting with c791eb4afad3d3b7319a4ecbb9ad7314d04c8eb6a4954765b916e231a57d04c0 not found: ID does not exist" containerID="c791eb4afad3d3b7319a4ecbb9ad7314d04c8eb6a4954765b916e231a57d04c0" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.849928 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c791eb4afad3d3b7319a4ecbb9ad7314d04c8eb6a4954765b916e231a57d04c0"} err="failed to get container status \"c791eb4afad3d3b7319a4ecbb9ad7314d04c8eb6a4954765b916e231a57d04c0\": rpc error: code = NotFound desc = could not find container \"c791eb4afad3d3b7319a4ecbb9ad7314d04c8eb6a4954765b916e231a57d04c0\": container with ID starting with c791eb4afad3d3b7319a4ecbb9ad7314d04c8eb6a4954765b916e231a57d04c0 not found: ID does not exist" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.849952 4764 scope.go:117] "RemoveContainer" containerID="ed7bd666dbaa56d7a77980118a8739a44cafc9a966ea8469e78e592a4a49ac96" Mar 09 14:11:04 crc kubenswrapper[4764]: E0309 14:11:04.850226 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed7bd666dbaa56d7a77980118a8739a44cafc9a966ea8469e78e592a4a49ac96\": container with ID starting with ed7bd666dbaa56d7a77980118a8739a44cafc9a966ea8469e78e592a4a49ac96 not found: ID does not exist" containerID="ed7bd666dbaa56d7a77980118a8739a44cafc9a966ea8469e78e592a4a49ac96" Mar 09 14:11:04 crc 
kubenswrapper[4764]: I0309 14:11:04.850253 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed7bd666dbaa56d7a77980118a8739a44cafc9a966ea8469e78e592a4a49ac96"} err="failed to get container status \"ed7bd666dbaa56d7a77980118a8739a44cafc9a966ea8469e78e592a4a49ac96\": rpc error: code = NotFound desc = could not find container \"ed7bd666dbaa56d7a77980118a8739a44cafc9a966ea8469e78e592a4a49ac96\": container with ID starting with ed7bd666dbaa56d7a77980118a8739a44cafc9a966ea8469e78e592a4a49ac96 not found: ID does not exist" Mar 09 14:11:05 crc kubenswrapper[4764]: I0309 14:11:05.000673 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-797d44c9b-wrlx7"] Mar 09 14:11:05 crc kubenswrapper[4764]: I0309 14:11:05.010620 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-797d44c9b-wrlx7"] Mar 09 14:11:05 crc kubenswrapper[4764]: I0309 14:11:05.572999 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e00fc104-ec73-4190-a598-86de7ca6cfa5" path="/var/lib/kubelet/pods/e00fc104-ec73-4190-a598-86de7ca6cfa5/volumes" Mar 09 14:11:08 crc kubenswrapper[4764]: I0309 14:11:08.951842 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Mar 09 14:11:09 crc kubenswrapper[4764]: I0309 14:11:09.929824 4764 scope.go:117] "RemoveContainer" containerID="0b92f70675fc36bf2d871b5244dabd98b74bc41c0324eee9ebebf35ece42f834" Mar 09 14:11:09 crc kubenswrapper[4764]: I0309 14:11:09.958363 4764 scope.go:117] "RemoveContainer" containerID="3b50e9b2a9f264bfcb22b787b33fb1bad8b4dde68292e87c3d306716bd5422e2" Mar 09 14:11:11 crc kubenswrapper[4764]: I0309 14:11:11.560622 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:11:11 crc kubenswrapper[4764]: E0309 14:11:11.561364 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:11:15 crc kubenswrapper[4764]: I0309 14:11:15.047793 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 09 14:11:20 crc kubenswrapper[4764]: I0309 14:11:20.544151 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Mar 09 14:11:25 crc kubenswrapper[4764]: I0309 14:11:25.566405 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:11:25 crc kubenswrapper[4764]: E0309 14:11:25.568860 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:11:40 crc kubenswrapper[4764]: I0309 14:11:40.560627 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:11:40 crc kubenswrapper[4764]: E0309 14:11:40.561954 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" 
podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:11:53 crc kubenswrapper[4764]: I0309 14:11:53.560116 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:11:53 crc kubenswrapper[4764]: E0309 14:11:53.561839 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:12:00 crc kubenswrapper[4764]: I0309 14:12:00.158324 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551092-qs44j"] Mar 09 14:12:00 crc kubenswrapper[4764]: E0309 14:12:00.159898 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e00fc104-ec73-4190-a598-86de7ca6cfa5" containerName="horizon" Mar 09 14:12:00 crc kubenswrapper[4764]: I0309 14:12:00.159930 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e00fc104-ec73-4190-a598-86de7ca6cfa5" containerName="horizon" Mar 09 14:12:00 crc kubenswrapper[4764]: E0309 14:12:00.159969 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e00fc104-ec73-4190-a598-86de7ca6cfa5" containerName="horizon-log" Mar 09 14:12:00 crc kubenswrapper[4764]: I0309 14:12:00.159981 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e00fc104-ec73-4190-a598-86de7ca6cfa5" containerName="horizon-log" Mar 09 14:12:00 crc kubenswrapper[4764]: I0309 14:12:00.160371 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e00fc104-ec73-4190-a598-86de7ca6cfa5" containerName="horizon-log" Mar 09 14:12:00 crc kubenswrapper[4764]: I0309 14:12:00.160392 4764 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e00fc104-ec73-4190-a598-86de7ca6cfa5" containerName="horizon" Mar 09 14:12:00 crc kubenswrapper[4764]: I0309 14:12:00.161607 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551092-qs44j" Mar 09 14:12:00 crc kubenswrapper[4764]: I0309 14:12:00.164345 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:12:00 crc kubenswrapper[4764]: I0309 14:12:00.164633 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 14:12:00 crc kubenswrapper[4764]: I0309 14:12:00.169467 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:12:00 crc kubenswrapper[4764]: I0309 14:12:00.173546 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551092-qs44j"] Mar 09 14:12:00 crc kubenswrapper[4764]: I0309 14:12:00.265501 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vpzk\" (UniqueName: \"kubernetes.io/projected/81a7f588-07b1-4ef1-97ee-420e944ad16b-kube-api-access-5vpzk\") pod \"auto-csr-approver-29551092-qs44j\" (UID: \"81a7f588-07b1-4ef1-97ee-420e944ad16b\") " pod="openshift-infra/auto-csr-approver-29551092-qs44j" Mar 09 14:12:00 crc kubenswrapper[4764]: I0309 14:12:00.367874 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vpzk\" (UniqueName: \"kubernetes.io/projected/81a7f588-07b1-4ef1-97ee-420e944ad16b-kube-api-access-5vpzk\") pod \"auto-csr-approver-29551092-qs44j\" (UID: \"81a7f588-07b1-4ef1-97ee-420e944ad16b\") " pod="openshift-infra/auto-csr-approver-29551092-qs44j" Mar 09 14:12:00 crc kubenswrapper[4764]: I0309 14:12:00.389878 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vpzk\" (UniqueName: 
\"kubernetes.io/projected/81a7f588-07b1-4ef1-97ee-420e944ad16b-kube-api-access-5vpzk\") pod \"auto-csr-approver-29551092-qs44j\" (UID: \"81a7f588-07b1-4ef1-97ee-420e944ad16b\") " pod="openshift-infra/auto-csr-approver-29551092-qs44j" Mar 09 14:12:00 crc kubenswrapper[4764]: I0309 14:12:00.493250 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551092-qs44j" Mar 09 14:12:00 crc kubenswrapper[4764]: I0309 14:12:00.974293 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551092-qs44j"] Mar 09 14:12:01 crc kubenswrapper[4764]: I0309 14:12:01.270125 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551092-qs44j" event={"ID":"81a7f588-07b1-4ef1-97ee-420e944ad16b","Type":"ContainerStarted","Data":"0757244299baed248a05242137e929ee6d2c210e50d4f4fe38f70f73594886c2"} Mar 09 14:12:03 crc kubenswrapper[4764]: I0309 14:12:03.297250 4764 generic.go:334] "Generic (PLEG): container finished" podID="81a7f588-07b1-4ef1-97ee-420e944ad16b" containerID="4148e832dbd4302f41c6dbad3eab96d625f5a7299088cd33baf2acb47b5d3267" exitCode=0 Mar 09 14:12:03 crc kubenswrapper[4764]: I0309 14:12:03.297333 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551092-qs44j" event={"ID":"81a7f588-07b1-4ef1-97ee-420e944ad16b","Type":"ContainerDied","Data":"4148e832dbd4302f41c6dbad3eab96d625f5a7299088cd33baf2acb47b5d3267"} Mar 09 14:12:04 crc kubenswrapper[4764]: I0309 14:12:04.705769 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551092-qs44j" Mar 09 14:12:04 crc kubenswrapper[4764]: I0309 14:12:04.881581 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vpzk\" (UniqueName: \"kubernetes.io/projected/81a7f588-07b1-4ef1-97ee-420e944ad16b-kube-api-access-5vpzk\") pod \"81a7f588-07b1-4ef1-97ee-420e944ad16b\" (UID: \"81a7f588-07b1-4ef1-97ee-420e944ad16b\") " Mar 09 14:12:04 crc kubenswrapper[4764]: I0309 14:12:04.889447 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81a7f588-07b1-4ef1-97ee-420e944ad16b-kube-api-access-5vpzk" (OuterVolumeSpecName: "kube-api-access-5vpzk") pod "81a7f588-07b1-4ef1-97ee-420e944ad16b" (UID: "81a7f588-07b1-4ef1-97ee-420e944ad16b"). InnerVolumeSpecName "kube-api-access-5vpzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:12:04 crc kubenswrapper[4764]: I0309 14:12:04.985786 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vpzk\" (UniqueName: \"kubernetes.io/projected/81a7f588-07b1-4ef1-97ee-420e944ad16b-kube-api-access-5vpzk\") on node \"crc\" DevicePath \"\"" Mar 09 14:12:05 crc kubenswrapper[4764]: I0309 14:12:05.325091 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551092-qs44j" event={"ID":"81a7f588-07b1-4ef1-97ee-420e944ad16b","Type":"ContainerDied","Data":"0757244299baed248a05242137e929ee6d2c210e50d4f4fe38f70f73594886c2"} Mar 09 14:12:05 crc kubenswrapper[4764]: I0309 14:12:05.325558 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0757244299baed248a05242137e929ee6d2c210e50d4f4fe38f70f73594886c2" Mar 09 14:12:05 crc kubenswrapper[4764]: I0309 14:12:05.325225 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551092-qs44j" Mar 09 14:12:05 crc kubenswrapper[4764]: I0309 14:12:05.799203 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551086-dn2gl"] Mar 09 14:12:05 crc kubenswrapper[4764]: I0309 14:12:05.812055 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551086-dn2gl"] Mar 09 14:12:06 crc kubenswrapper[4764]: I0309 14:12:06.560637 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:12:06 crc kubenswrapper[4764]: E0309 14:12:06.561222 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:12:07 crc kubenswrapper[4764]: I0309 14:12:07.570471 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55c3951f-6e8b-46f4-9332-9c5d658862e4" path="/var/lib/kubelet/pods/55c3951f-6e8b-46f4-9332-9c5d658862e4/volumes" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.758923 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 09 14:12:08 crc kubenswrapper[4764]: E0309 14:12:08.759952 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81a7f588-07b1-4ef1-97ee-420e944ad16b" containerName="oc" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.759971 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="81a7f588-07b1-4ef1-97ee-420e944ad16b" containerName="oc" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.760286 4764 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="81a7f588-07b1-4ef1-97ee-420e944ad16b" containerName="oc" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.761283 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.765977 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-9bk55" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.766898 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.767005 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.767237 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.771616 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.878591 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9db4k\" (UniqueName: \"kubernetes.io/projected/5db22a0e-ee1a-4b26-9e49-b26644266834-kube-api-access-9db4k\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.879167 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5db22a0e-ee1a-4b26-9e49-b26644266834-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 
14:12:08.879203 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.879234 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5db22a0e-ee1a-4b26-9e49-b26644266834-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.879496 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5db22a0e-ee1a-4b26-9e49-b26644266834-config-data\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.879864 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.880248 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5db22a0e-ee1a-4b26-9e49-b26644266834-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 
14:12:08.880359 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.880585 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.982390 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5db22a0e-ee1a-4b26-9e49-b26644266834-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.982459 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.982497 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5db22a0e-ee1a-4b26-9e49-b26644266834-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.982546 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5db22a0e-ee1a-4b26-9e49-b26644266834-config-data\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.982591 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.982722 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5db22a0e-ee1a-4b26-9e49-b26644266834-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.982759 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.982802 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.982833 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9db4k\" (UniqueName: 
\"kubernetes.io/projected/5db22a0e-ee1a-4b26-9e49-b26644266834-kube-api-access-9db4k\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.983030 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5db22a0e-ee1a-4b26-9e49-b26644266834-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.983405 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.983497 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5db22a0e-ee1a-4b26-9e49-b26644266834-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.983817 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5db22a0e-ee1a-4b26-9e49-b26644266834-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.984804 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/5db22a0e-ee1a-4b26-9e49-b26644266834-config-data\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.992516 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.993156 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.993918 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:09 crc kubenswrapper[4764]: I0309 14:12:09.006806 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9db4k\" (UniqueName: \"kubernetes.io/projected/5db22a0e-ee1a-4b26-9e49-b26644266834-kube-api-access-9db4k\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:09 crc kubenswrapper[4764]: I0309 14:12:09.013150 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " 
pod="openstack/tempest-tests-tempest" Mar 09 14:12:09 crc kubenswrapper[4764]: I0309 14:12:09.090999 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 09 14:12:09 crc kubenswrapper[4764]: I0309 14:12:09.616309 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 09 14:12:09 crc kubenswrapper[4764]: I0309 14:12:09.624156 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 14:12:10 crc kubenswrapper[4764]: I0309 14:12:10.176904 4764 scope.go:117] "RemoveContainer" containerID="4dfa06f7a78f4ddaf54d7700f7fde17ae280a4e75103177bb66105376bf4e6a2" Mar 09 14:12:10 crc kubenswrapper[4764]: I0309 14:12:10.377743 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5db22a0e-ee1a-4b26-9e49-b26644266834","Type":"ContainerStarted","Data":"bb1050b512fa7c7108bc1149d279ca654a7191121729b646cce9a919a3a91226"} Mar 09 14:12:18 crc kubenswrapper[4764]: I0309 14:12:18.563933 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:12:18 crc kubenswrapper[4764]: E0309 14:12:18.566322 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:12:31 crc kubenswrapper[4764]: I0309 14:12:31.566467 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:12:31 crc kubenswrapper[4764]: E0309 14:12:31.567261 4764 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:12:40 crc kubenswrapper[4764]: E0309 14:12:40.555666 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 09 14:12:40 crc kubenswrapper[4764]: E0309 14:12:40.557966 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,
MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9db4k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(5db22a0e-ee1a-4b26-9e49-b26644266834): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 14:12:40 crc 
kubenswrapper[4764]: E0309 14:12:40.561636 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="5db22a0e-ee1a-4b26-9e49-b26644266834" Mar 09 14:12:40 crc kubenswrapper[4764]: E0309 14:12:40.754943 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="5db22a0e-ee1a-4b26-9e49-b26644266834" Mar 09 14:12:45 crc kubenswrapper[4764]: I0309 14:12:45.571538 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:12:45 crc kubenswrapper[4764]: E0309 14:12:45.573074 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:12:56 crc kubenswrapper[4764]: I0309 14:12:56.212541 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 09 14:12:57 crc kubenswrapper[4764]: I0309 14:12:57.952585 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5db22a0e-ee1a-4b26-9e49-b26644266834","Type":"ContainerStarted","Data":"88276b497758f3b5f0f6060dd845ad4b0ff499d7eeb52fb1af2c65524e2ceaba"} Mar 09 14:12:57 crc kubenswrapper[4764]: I0309 14:12:57.976546 4764 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.391193628 podStartE2EDuration="50.976520558s" podCreationTimestamp="2026-03-09 14:12:07 +0000 UTC" firstStartedPulling="2026-03-09 14:12:09.623880542 +0000 UTC m=+3084.874052450" lastFinishedPulling="2026-03-09 14:12:56.209207472 +0000 UTC m=+3131.459379380" observedRunningTime="2026-03-09 14:12:57.973540488 +0000 UTC m=+3133.223712416" watchObservedRunningTime="2026-03-09 14:12:57.976520558 +0000 UTC m=+3133.226692476" Mar 09 14:12:59 crc kubenswrapper[4764]: I0309 14:12:59.560725 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:12:59 crc kubenswrapper[4764]: E0309 14:12:59.561564 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:13:11 crc kubenswrapper[4764]: I0309 14:13:11.560050 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:13:11 crc kubenswrapper[4764]: E0309 14:13:11.561360 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:13:24 crc kubenswrapper[4764]: I0309 14:13:24.560008 4764 scope.go:117] "RemoveContainer" 
containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:13:24 crc kubenswrapper[4764]: E0309 14:13:24.561180 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:13:38 crc kubenswrapper[4764]: I0309 14:13:38.319292 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9lzgr"] Mar 09 14:13:38 crc kubenswrapper[4764]: I0309 14:13:38.323426 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9lzgr" Mar 09 14:13:38 crc kubenswrapper[4764]: I0309 14:13:38.333850 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9lzgr"] Mar 09 14:13:38 crc kubenswrapper[4764]: I0309 14:13:38.414308 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpc6l\" (UniqueName: \"kubernetes.io/projected/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-kube-api-access-tpc6l\") pod \"certified-operators-9lzgr\" (UID: \"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db\") " pod="openshift-marketplace/certified-operators-9lzgr" Mar 09 14:13:38 crc kubenswrapper[4764]: I0309 14:13:38.414462 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-utilities\") pod \"certified-operators-9lzgr\" (UID: \"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db\") " pod="openshift-marketplace/certified-operators-9lzgr" Mar 09 14:13:38 crc kubenswrapper[4764]: I0309 
14:13:38.414842 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-catalog-content\") pod \"certified-operators-9lzgr\" (UID: \"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db\") " pod="openshift-marketplace/certified-operators-9lzgr" Mar 09 14:13:38 crc kubenswrapper[4764]: I0309 14:13:38.517905 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-utilities\") pod \"certified-operators-9lzgr\" (UID: \"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db\") " pod="openshift-marketplace/certified-operators-9lzgr" Mar 09 14:13:38 crc kubenswrapper[4764]: I0309 14:13:38.518144 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-catalog-content\") pod \"certified-operators-9lzgr\" (UID: \"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db\") " pod="openshift-marketplace/certified-operators-9lzgr" Mar 09 14:13:38 crc kubenswrapper[4764]: I0309 14:13:38.518230 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpc6l\" (UniqueName: \"kubernetes.io/projected/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-kube-api-access-tpc6l\") pod \"certified-operators-9lzgr\" (UID: \"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db\") " pod="openshift-marketplace/certified-operators-9lzgr" Mar 09 14:13:38 crc kubenswrapper[4764]: I0309 14:13:38.518525 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-utilities\") pod \"certified-operators-9lzgr\" (UID: \"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db\") " pod="openshift-marketplace/certified-operators-9lzgr" Mar 09 14:13:38 crc kubenswrapper[4764]: I0309 14:13:38.518579 
4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-catalog-content\") pod \"certified-operators-9lzgr\" (UID: \"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db\") " pod="openshift-marketplace/certified-operators-9lzgr" Mar 09 14:13:38 crc kubenswrapper[4764]: I0309 14:13:38.545320 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpc6l\" (UniqueName: \"kubernetes.io/projected/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-kube-api-access-tpc6l\") pod \"certified-operators-9lzgr\" (UID: \"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db\") " pod="openshift-marketplace/certified-operators-9lzgr" Mar 09 14:13:38 crc kubenswrapper[4764]: I0309 14:13:38.560227 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:13:38 crc kubenswrapper[4764]: E0309 14:13:38.560556 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:13:38 crc kubenswrapper[4764]: I0309 14:13:38.658583 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9lzgr" Mar 09 14:13:39 crc kubenswrapper[4764]: I0309 14:13:39.232391 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9lzgr"] Mar 09 14:13:39 crc kubenswrapper[4764]: I0309 14:13:39.587781 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lzgr" event={"ID":"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db","Type":"ContainerStarted","Data":"1d30d3bafe785b61035cfd6776bbf1fbd30eae865400a81c90fd508159791920"} Mar 09 14:13:39 crc kubenswrapper[4764]: I0309 14:13:39.587848 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lzgr" event={"ID":"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db","Type":"ContainerStarted","Data":"8a4440269200c76a2fe2d685a91e0e74538a4b746197ce327947e0be241c133f"} Mar 09 14:13:40 crc kubenswrapper[4764]: I0309 14:13:40.600001 4764 generic.go:334] "Generic (PLEG): container finished" podID="8414d1b3-79f4-4eb4-b7fc-e85caf18e1db" containerID="1d30d3bafe785b61035cfd6776bbf1fbd30eae865400a81c90fd508159791920" exitCode=0 Mar 09 14:13:40 crc kubenswrapper[4764]: I0309 14:13:40.600085 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lzgr" event={"ID":"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db","Type":"ContainerDied","Data":"1d30d3bafe785b61035cfd6776bbf1fbd30eae865400a81c90fd508159791920"} Mar 09 14:13:41 crc kubenswrapper[4764]: I0309 14:13:41.615593 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lzgr" event={"ID":"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db","Type":"ContainerStarted","Data":"a83e9cbe22286a479af2a92b6e16b2800ac926082aaf40b98e4defaf7a33610b"} Mar 09 14:13:44 crc kubenswrapper[4764]: I0309 14:13:44.652767 4764 generic.go:334] "Generic (PLEG): container finished" podID="8414d1b3-79f4-4eb4-b7fc-e85caf18e1db" 
containerID="a83e9cbe22286a479af2a92b6e16b2800ac926082aaf40b98e4defaf7a33610b" exitCode=0 Mar 09 14:13:44 crc kubenswrapper[4764]: I0309 14:13:44.652813 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lzgr" event={"ID":"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db","Type":"ContainerDied","Data":"a83e9cbe22286a479af2a92b6e16b2800ac926082aaf40b98e4defaf7a33610b"} Mar 09 14:13:45 crc kubenswrapper[4764]: I0309 14:13:45.668348 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lzgr" event={"ID":"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db","Type":"ContainerStarted","Data":"25dd59f6d05d4fb8f17a37a81bd52fddb49fdd1d34f6ac322a9adb36613a0c6b"} Mar 09 14:13:45 crc kubenswrapper[4764]: I0309 14:13:45.694726 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9lzgr" podStartSLOduration=2.155137017 podStartE2EDuration="7.694689691s" podCreationTimestamp="2026-03-09 14:13:38 +0000 UTC" firstStartedPulling="2026-03-09 14:13:39.591257981 +0000 UTC m=+3174.841429889" lastFinishedPulling="2026-03-09 14:13:45.130810615 +0000 UTC m=+3180.380982563" observedRunningTime="2026-03-09 14:13:45.691210038 +0000 UTC m=+3180.941381956" watchObservedRunningTime="2026-03-09 14:13:45.694689691 +0000 UTC m=+3180.944861669" Mar 09 14:13:48 crc kubenswrapper[4764]: I0309 14:13:48.659295 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9lzgr" Mar 09 14:13:48 crc kubenswrapper[4764]: I0309 14:13:48.660550 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9lzgr" Mar 09 14:13:48 crc kubenswrapper[4764]: I0309 14:13:48.750392 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9lzgr" Mar 09 14:13:52 crc kubenswrapper[4764]: I0309 
14:13:52.559526 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:13:52 crc kubenswrapper[4764]: E0309 14:13:52.560540 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:13:58 crc kubenswrapper[4764]: I0309 14:13:58.716770 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9lzgr" Mar 09 14:13:59 crc kubenswrapper[4764]: I0309 14:13:59.386867 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9lzgr"] Mar 09 14:13:59 crc kubenswrapper[4764]: I0309 14:13:59.387168 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9lzgr" podUID="8414d1b3-79f4-4eb4-b7fc-e85caf18e1db" containerName="registry-server" containerID="cri-o://25dd59f6d05d4fb8f17a37a81bd52fddb49fdd1d34f6ac322a9adb36613a0c6b" gracePeriod=2 Mar 09 14:13:59 crc kubenswrapper[4764]: I0309 14:13:59.834045 4764 generic.go:334] "Generic (PLEG): container finished" podID="8414d1b3-79f4-4eb4-b7fc-e85caf18e1db" containerID="25dd59f6d05d4fb8f17a37a81bd52fddb49fdd1d34f6ac322a9adb36613a0c6b" exitCode=0 Mar 09 14:13:59 crc kubenswrapper[4764]: I0309 14:13:59.834598 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lzgr" event={"ID":"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db","Type":"ContainerDied","Data":"25dd59f6d05d4fb8f17a37a81bd52fddb49fdd1d34f6ac322a9adb36613a0c6b"} Mar 09 14:13:59 crc kubenswrapper[4764]: I0309 14:13:59.834637 
4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lzgr" event={"ID":"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db","Type":"ContainerDied","Data":"8a4440269200c76a2fe2d685a91e0e74538a4b746197ce327947e0be241c133f"} Mar 09 14:13:59 crc kubenswrapper[4764]: I0309 14:13:59.834738 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a4440269200c76a2fe2d685a91e0e74538a4b746197ce327947e0be241c133f" Mar 09 14:13:59 crc kubenswrapper[4764]: I0309 14:13:59.879136 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9lzgr" Mar 09 14:13:59 crc kubenswrapper[4764]: I0309 14:13:59.979424 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-utilities\") pod \"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db\" (UID: \"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db\") " Mar 09 14:13:59 crc kubenswrapper[4764]: I0309 14:13:59.979673 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpc6l\" (UniqueName: \"kubernetes.io/projected/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-kube-api-access-tpc6l\") pod \"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db\" (UID: \"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db\") " Mar 09 14:13:59 crc kubenswrapper[4764]: I0309 14:13:59.979830 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-catalog-content\") pod \"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db\" (UID: \"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db\") " Mar 09 14:13:59 crc kubenswrapper[4764]: I0309 14:13:59.980438 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-utilities" (OuterVolumeSpecName: "utilities") pod 
"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db" (UID: "8414d1b3-79f4-4eb4-b7fc-e85caf18e1db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:13:59 crc kubenswrapper[4764]: I0309 14:13:59.981599 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:13:59 crc kubenswrapper[4764]: I0309 14:13:59.988414 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-kube-api-access-tpc6l" (OuterVolumeSpecName: "kube-api-access-tpc6l") pod "8414d1b3-79f4-4eb4-b7fc-e85caf18e1db" (UID: "8414d1b3-79f4-4eb4-b7fc-e85caf18e1db"). InnerVolumeSpecName "kube-api-access-tpc6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.063857 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8414d1b3-79f4-4eb4-b7fc-e85caf18e1db" (UID: "8414d1b3-79f4-4eb4-b7fc-e85caf18e1db"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.084454 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpc6l\" (UniqueName: \"kubernetes.io/projected/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-kube-api-access-tpc6l\") on node \"crc\" DevicePath \"\"" Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.084497 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.155814 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551094-w5qt4"] Mar 09 14:14:00 crc kubenswrapper[4764]: E0309 14:14:00.156428 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8414d1b3-79f4-4eb4-b7fc-e85caf18e1db" containerName="extract-utilities" Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.156444 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8414d1b3-79f4-4eb4-b7fc-e85caf18e1db" containerName="extract-utilities" Mar 09 14:14:00 crc kubenswrapper[4764]: E0309 14:14:00.156460 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8414d1b3-79f4-4eb4-b7fc-e85caf18e1db" containerName="registry-server" Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.156466 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8414d1b3-79f4-4eb4-b7fc-e85caf18e1db" containerName="registry-server" Mar 09 14:14:00 crc kubenswrapper[4764]: E0309 14:14:00.156510 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8414d1b3-79f4-4eb4-b7fc-e85caf18e1db" containerName="extract-content" Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.156516 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8414d1b3-79f4-4eb4-b7fc-e85caf18e1db" containerName="extract-content" Mar 09 14:14:00 crc 
kubenswrapper[4764]: I0309 14:14:00.156800 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8414d1b3-79f4-4eb4-b7fc-e85caf18e1db" containerName="registry-server" Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.157826 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551094-w5qt4" Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.161349 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.161410 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.161436 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.170780 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551094-w5qt4"] Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.289042 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrvgc\" (UniqueName: \"kubernetes.io/projected/d7d72c38-b071-4fcb-89b4-935542a1943e-kube-api-access-vrvgc\") pod \"auto-csr-approver-29551094-w5qt4\" (UID: \"d7d72c38-b071-4fcb-89b4-935542a1943e\") " pod="openshift-infra/auto-csr-approver-29551094-w5qt4" Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.390863 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrvgc\" (UniqueName: \"kubernetes.io/projected/d7d72c38-b071-4fcb-89b4-935542a1943e-kube-api-access-vrvgc\") pod \"auto-csr-approver-29551094-w5qt4\" (UID: \"d7d72c38-b071-4fcb-89b4-935542a1943e\") " pod="openshift-infra/auto-csr-approver-29551094-w5qt4" Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.411592 
4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrvgc\" (UniqueName: \"kubernetes.io/projected/d7d72c38-b071-4fcb-89b4-935542a1943e-kube-api-access-vrvgc\") pod \"auto-csr-approver-29551094-w5qt4\" (UID: \"d7d72c38-b071-4fcb-89b4-935542a1943e\") " pod="openshift-infra/auto-csr-approver-29551094-w5qt4" Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.485439 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551094-w5qt4" Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.843146 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9lzgr" Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.892539 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9lzgr"] Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.901749 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9lzgr"] Mar 09 14:14:00 crc kubenswrapper[4764]: E0309 14:14:00.987930 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8414d1b3_79f4_4eb4_b7fc_e85caf18e1db.slice/crio-8a4440269200c76a2fe2d685a91e0e74538a4b746197ce327947e0be241c133f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8414d1b3_79f4_4eb4_b7fc_e85caf18e1db.slice\": RecentStats: unable to find data in memory cache]" Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.996804 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551094-w5qt4"] Mar 09 14:14:01 crc kubenswrapper[4764]: W0309 14:14:01.011424 4764 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7d72c38_b071_4fcb_89b4_935542a1943e.slice/crio-68241c662e00c69533831193ee003a89b7f9795f5fb6f4f7cd8b120dc63d0036 WatchSource:0}: Error finding container 68241c662e00c69533831193ee003a89b7f9795f5fb6f4f7cd8b120dc63d0036: Status 404 returned error can't find the container with id 68241c662e00c69533831193ee003a89b7f9795f5fb6f4f7cd8b120dc63d0036 Mar 09 14:14:01 crc kubenswrapper[4764]: I0309 14:14:01.574478 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8414d1b3-79f4-4eb4-b7fc-e85caf18e1db" path="/var/lib/kubelet/pods/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db/volumes" Mar 09 14:14:01 crc kubenswrapper[4764]: I0309 14:14:01.859988 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551094-w5qt4" event={"ID":"d7d72c38-b071-4fcb-89b4-935542a1943e","Type":"ContainerStarted","Data":"68241c662e00c69533831193ee003a89b7f9795f5fb6f4f7cd8b120dc63d0036"} Mar 09 14:14:02 crc kubenswrapper[4764]: I0309 14:14:02.873881 4764 generic.go:334] "Generic (PLEG): container finished" podID="d7d72c38-b071-4fcb-89b4-935542a1943e" containerID="865365e28885b4a1f3c0039c8f24992ea422c88be0ea3e7581c981c450049a6c" exitCode=0 Mar 09 14:14:02 crc kubenswrapper[4764]: I0309 14:14:02.873975 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551094-w5qt4" event={"ID":"d7d72c38-b071-4fcb-89b4-935542a1943e","Type":"ContainerDied","Data":"865365e28885b4a1f3c0039c8f24992ea422c88be0ea3e7581c981c450049a6c"} Mar 09 14:14:04 crc kubenswrapper[4764]: I0309 14:14:04.282236 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551094-w5qt4" Mar 09 14:14:04 crc kubenswrapper[4764]: I0309 14:14:04.405557 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrvgc\" (UniqueName: \"kubernetes.io/projected/d7d72c38-b071-4fcb-89b4-935542a1943e-kube-api-access-vrvgc\") pod \"d7d72c38-b071-4fcb-89b4-935542a1943e\" (UID: \"d7d72c38-b071-4fcb-89b4-935542a1943e\") " Mar 09 14:14:04 crc kubenswrapper[4764]: I0309 14:14:04.414845 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7d72c38-b071-4fcb-89b4-935542a1943e-kube-api-access-vrvgc" (OuterVolumeSpecName: "kube-api-access-vrvgc") pod "d7d72c38-b071-4fcb-89b4-935542a1943e" (UID: "d7d72c38-b071-4fcb-89b4-935542a1943e"). InnerVolumeSpecName "kube-api-access-vrvgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:14:04 crc kubenswrapper[4764]: I0309 14:14:04.510340 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrvgc\" (UniqueName: \"kubernetes.io/projected/d7d72c38-b071-4fcb-89b4-935542a1943e-kube-api-access-vrvgc\") on node \"crc\" DevicePath \"\"" Mar 09 14:14:04 crc kubenswrapper[4764]: I0309 14:14:04.896469 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551094-w5qt4" event={"ID":"d7d72c38-b071-4fcb-89b4-935542a1943e","Type":"ContainerDied","Data":"68241c662e00c69533831193ee003a89b7f9795f5fb6f4f7cd8b120dc63d0036"} Mar 09 14:14:04 crc kubenswrapper[4764]: I0309 14:14:04.896531 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68241c662e00c69533831193ee003a89b7f9795f5fb6f4f7cd8b120dc63d0036" Mar 09 14:14:04 crc kubenswrapper[4764]: I0309 14:14:04.896582 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551094-w5qt4" Mar 09 14:14:05 crc kubenswrapper[4764]: I0309 14:14:05.361097 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551088-wtbvc"] Mar 09 14:14:05 crc kubenswrapper[4764]: I0309 14:14:05.373997 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551088-wtbvc"] Mar 09 14:14:05 crc kubenswrapper[4764]: I0309 14:14:05.567978 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:14:05 crc kubenswrapper[4764]: E0309 14:14:05.570017 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:14:05 crc kubenswrapper[4764]: I0309 14:14:05.572780 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="902ae1d9-a43c-46c6-a492-10ee0242e721" path="/var/lib/kubelet/pods/902ae1d9-a43c-46c6-a492-10ee0242e721/volumes" Mar 09 14:14:10 crc kubenswrapper[4764]: I0309 14:14:10.298012 4764 scope.go:117] "RemoveContainer" containerID="20854488fc265f9f6a154273e0d45920e3a458753547063958f0c25388b08a64" Mar 09 14:14:16 crc kubenswrapper[4764]: I0309 14:14:16.559887 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:14:16 crc kubenswrapper[4764]: E0309 14:14:16.560791 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:14:27 crc kubenswrapper[4764]: I0309 14:14:27.560960 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:14:27 crc kubenswrapper[4764]: E0309 14:14:27.562138 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:14:39 crc kubenswrapper[4764]: I0309 14:14:39.560431 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:14:39 crc kubenswrapper[4764]: E0309 14:14:39.561690 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:14:54 crc kubenswrapper[4764]: I0309 14:14:54.560425 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:14:54 crc kubenswrapper[4764]: E0309 14:14:54.561430 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.158130 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78"] Mar 09 14:15:00 crc kubenswrapper[4764]: E0309 14:15:00.159576 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7d72c38-b071-4fcb-89b4-935542a1943e" containerName="oc" Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.159596 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7d72c38-b071-4fcb-89b4-935542a1943e" containerName="oc" Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.159865 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7d72c38-b071-4fcb-89b4-935542a1943e" containerName="oc" Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.160693 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78" Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.164149 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.168692 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.212225 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78"] Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.293118 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n45lc\" (UniqueName: \"kubernetes.io/projected/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-kube-api-access-n45lc\") pod \"collect-profiles-29551095-z2m78\" (UID: \"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78" Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.293198 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-config-volume\") pod \"collect-profiles-29551095-z2m78\" (UID: \"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78" Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.293315 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-secret-volume\") pod \"collect-profiles-29551095-z2m78\" (UID: \"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78" Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.395183 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-config-volume\") pod \"collect-profiles-29551095-z2m78\" (UID: \"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78" Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.395417 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-secret-volume\") pod \"collect-profiles-29551095-z2m78\" (UID: \"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78" Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.395600 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n45lc\" (UniqueName: \"kubernetes.io/projected/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-kube-api-access-n45lc\") pod \"collect-profiles-29551095-z2m78\" (UID: \"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78" Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.397234 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-config-volume\") pod \"collect-profiles-29551095-z2m78\" (UID: \"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78" Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.405164 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-secret-volume\") pod \"collect-profiles-29551095-z2m78\" (UID: \"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78" Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.416046 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n45lc\" (UniqueName: \"kubernetes.io/projected/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-kube-api-access-n45lc\") pod \"collect-profiles-29551095-z2m78\" (UID: \"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78" Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.489547 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78" Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.987860 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78"] Mar 09 14:15:01 crc kubenswrapper[4764]: I0309 14:15:01.486390 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78" event={"ID":"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0","Type":"ContainerStarted","Data":"ab97d625c108341b13c2db5e1056def5231f61fcd681f959a64e961f2a15864d"} Mar 09 14:15:01 crc kubenswrapper[4764]: I0309 14:15:01.486844 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78" event={"ID":"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0","Type":"ContainerStarted","Data":"2a4bac2ac8d89facbed3155024eab8ea4ab9e1aa36b9e90e3e95aabe36df67a2"} Mar 09 14:15:02 crc kubenswrapper[4764]: I0309 14:15:02.496834 4764 generic.go:334] "Generic (PLEG): container finished" podID="697f22c2-daeb-4ae7-a0da-4e80bfad4cb0" 
containerID="ab97d625c108341b13c2db5e1056def5231f61fcd681f959a64e961f2a15864d" exitCode=0 Mar 09 14:15:02 crc kubenswrapper[4764]: I0309 14:15:02.496929 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78" event={"ID":"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0","Type":"ContainerDied","Data":"ab97d625c108341b13c2db5e1056def5231f61fcd681f959a64e961f2a15864d"} Mar 09 14:15:03 crc kubenswrapper[4764]: I0309 14:15:03.877374 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78" Mar 09 14:15:04 crc kubenswrapper[4764]: I0309 14:15:04.012036 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n45lc\" (UniqueName: \"kubernetes.io/projected/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-kube-api-access-n45lc\") pod \"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0\" (UID: \"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0\") " Mar 09 14:15:04 crc kubenswrapper[4764]: I0309 14:15:04.012142 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-secret-volume\") pod \"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0\" (UID: \"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0\") " Mar 09 14:15:04 crc kubenswrapper[4764]: I0309 14:15:04.012323 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-config-volume\") pod \"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0\" (UID: \"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0\") " Mar 09 14:15:04 crc kubenswrapper[4764]: I0309 14:15:04.013884 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0" (UID: "697f22c2-daeb-4ae7-a0da-4e80bfad4cb0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:15:04 crc kubenswrapper[4764]: I0309 14:15:04.020117 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-kube-api-access-n45lc" (OuterVolumeSpecName: "kube-api-access-n45lc") pod "697f22c2-daeb-4ae7-a0da-4e80bfad4cb0" (UID: "697f22c2-daeb-4ae7-a0da-4e80bfad4cb0"). InnerVolumeSpecName "kube-api-access-n45lc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:15:04 crc kubenswrapper[4764]: I0309 14:15:04.020837 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "697f22c2-daeb-4ae7-a0da-4e80bfad4cb0" (UID: "697f22c2-daeb-4ae7-a0da-4e80bfad4cb0"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:15:04 crc kubenswrapper[4764]: I0309 14:15:04.115621 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 14:15:04 crc kubenswrapper[4764]: I0309 14:15:04.115950 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n45lc\" (UniqueName: \"kubernetes.io/projected/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-kube-api-access-n45lc\") on node \"crc\" DevicePath \"\"" Mar 09 14:15:04 crc kubenswrapper[4764]: I0309 14:15:04.116044 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 14:15:04 crc kubenswrapper[4764]: I0309 14:15:04.524377 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78" event={"ID":"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0","Type":"ContainerDied","Data":"2a4bac2ac8d89facbed3155024eab8ea4ab9e1aa36b9e90e3e95aabe36df67a2"} Mar 09 14:15:04 crc kubenswrapper[4764]: I0309 14:15:04.525189 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a4bac2ac8d89facbed3155024eab8ea4ab9e1aa36b9e90e3e95aabe36df67a2" Mar 09 14:15:04 crc kubenswrapper[4764]: I0309 14:15:04.524616 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78" Mar 09 14:15:04 crc kubenswrapper[4764]: I0309 14:15:04.973761 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp"] Mar 09 14:15:04 crc kubenswrapper[4764]: I0309 14:15:04.982162 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp"] Mar 09 14:15:05 crc kubenswrapper[4764]: I0309 14:15:05.579338 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a7784d6-384a-426a-8c7f-17738461c327" path="/var/lib/kubelet/pods/1a7784d6-384a-426a-8c7f-17738461c327/volumes" Mar 09 14:15:06 crc kubenswrapper[4764]: I0309 14:15:06.560494 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:15:06 crc kubenswrapper[4764]: E0309 14:15:06.561097 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:15:10 crc kubenswrapper[4764]: I0309 14:15:10.397893 4764 scope.go:117] "RemoveContainer" containerID="aa846508ebc812b1aaea2ee2e48b6017bd527c31a06deb61dd1001786fb1b811" Mar 09 14:15:20 crc kubenswrapper[4764]: I0309 14:15:20.561571 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:15:20 crc kubenswrapper[4764]: E0309 14:15:20.565879 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:15:34 crc kubenswrapper[4764]: I0309 14:15:34.560717 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:15:34 crc kubenswrapper[4764]: I0309 14:15:34.857511 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"e0fd7c06957b904af3795728b54332eb3bc1c0b6a8df56e1a4bc3599bdedd8b5"} Mar 09 14:16:00 crc kubenswrapper[4764]: I0309 14:16:00.195778 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551096-9bl56"] Mar 09 14:16:00 crc kubenswrapper[4764]: E0309 14:16:00.197078 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="697f22c2-daeb-4ae7-a0da-4e80bfad4cb0" containerName="collect-profiles" Mar 09 14:16:00 crc kubenswrapper[4764]: I0309 14:16:00.197098 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="697f22c2-daeb-4ae7-a0da-4e80bfad4cb0" containerName="collect-profiles" Mar 09 14:16:00 crc kubenswrapper[4764]: I0309 14:16:00.197378 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="697f22c2-daeb-4ae7-a0da-4e80bfad4cb0" containerName="collect-profiles" Mar 09 14:16:00 crc kubenswrapper[4764]: I0309 14:16:00.198262 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551096-9bl56" Mar 09 14:16:00 crc kubenswrapper[4764]: I0309 14:16:00.204028 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:16:00 crc kubenswrapper[4764]: I0309 14:16:00.204246 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:16:00 crc kubenswrapper[4764]: I0309 14:16:00.205059 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 14:16:00 crc kubenswrapper[4764]: I0309 14:16:00.212716 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551096-9bl56"] Mar 09 14:16:00 crc kubenswrapper[4764]: I0309 14:16:00.330179 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pkrd\" (UniqueName: \"kubernetes.io/projected/40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3-kube-api-access-2pkrd\") pod \"auto-csr-approver-29551096-9bl56\" (UID: \"40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3\") " pod="openshift-infra/auto-csr-approver-29551096-9bl56" Mar 09 14:16:00 crc kubenswrapper[4764]: I0309 14:16:00.433116 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pkrd\" (UniqueName: \"kubernetes.io/projected/40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3-kube-api-access-2pkrd\") pod \"auto-csr-approver-29551096-9bl56\" (UID: \"40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3\") " pod="openshift-infra/auto-csr-approver-29551096-9bl56" Mar 09 14:16:00 crc kubenswrapper[4764]: I0309 14:16:00.458383 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pkrd\" (UniqueName: \"kubernetes.io/projected/40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3-kube-api-access-2pkrd\") pod \"auto-csr-approver-29551096-9bl56\" (UID: \"40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3\") " 
pod="openshift-infra/auto-csr-approver-29551096-9bl56" Mar 09 14:16:00 crc kubenswrapper[4764]: I0309 14:16:00.522264 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551096-9bl56" Mar 09 14:16:01 crc kubenswrapper[4764]: I0309 14:16:01.057706 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551096-9bl56"] Mar 09 14:16:01 crc kubenswrapper[4764]: I0309 14:16:01.113047 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551096-9bl56" event={"ID":"40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3","Type":"ContainerStarted","Data":"ec35db2146bde758526b41276fc340a90033203c67b677f74b8aef8ab3a243b1"} Mar 09 14:16:03 crc kubenswrapper[4764]: I0309 14:16:03.134847 4764 generic.go:334] "Generic (PLEG): container finished" podID="40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3" containerID="ec447d2d121a3c0de24bcb688dbed1ad219802d16c0a98f40138fd4ac8dc4b32" exitCode=0 Mar 09 14:16:03 crc kubenswrapper[4764]: I0309 14:16:03.134936 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551096-9bl56" event={"ID":"40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3","Type":"ContainerDied","Data":"ec447d2d121a3c0de24bcb688dbed1ad219802d16c0a98f40138fd4ac8dc4b32"} Mar 09 14:16:04 crc kubenswrapper[4764]: I0309 14:16:04.657126 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551096-9bl56" Mar 09 14:16:04 crc kubenswrapper[4764]: I0309 14:16:04.736272 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pkrd\" (UniqueName: \"kubernetes.io/projected/40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3-kube-api-access-2pkrd\") pod \"40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3\" (UID: \"40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3\") " Mar 09 14:16:04 crc kubenswrapper[4764]: I0309 14:16:04.744387 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3-kube-api-access-2pkrd" (OuterVolumeSpecName: "kube-api-access-2pkrd") pod "40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3" (UID: "40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3"). InnerVolumeSpecName "kube-api-access-2pkrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:16:04 crc kubenswrapper[4764]: I0309 14:16:04.840040 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pkrd\" (UniqueName: \"kubernetes.io/projected/40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3-kube-api-access-2pkrd\") on node \"crc\" DevicePath \"\"" Mar 09 14:16:05 crc kubenswrapper[4764]: I0309 14:16:05.176200 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551096-9bl56" event={"ID":"40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3","Type":"ContainerDied","Data":"ec35db2146bde758526b41276fc340a90033203c67b677f74b8aef8ab3a243b1"} Mar 09 14:16:05 crc kubenswrapper[4764]: I0309 14:16:05.176253 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec35db2146bde758526b41276fc340a90033203c67b677f74b8aef8ab3a243b1" Mar 09 14:16:05 crc kubenswrapper[4764]: I0309 14:16:05.176266 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551096-9bl56" Mar 09 14:16:05 crc kubenswrapper[4764]: I0309 14:16:05.758766 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551090-8wdlp"] Mar 09 14:16:05 crc kubenswrapper[4764]: I0309 14:16:05.776211 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551090-8wdlp"] Mar 09 14:16:07 crc kubenswrapper[4764]: I0309 14:16:07.571599 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e" path="/var/lib/kubelet/pods/4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e/volumes" Mar 09 14:16:10 crc kubenswrapper[4764]: I0309 14:16:10.495566 4764 scope.go:117] "RemoveContainer" containerID="30776e46d343f3c10f72753cb59ea338812b698d41259546e8c5f506b57f2e53" Mar 09 14:16:10 crc kubenswrapper[4764]: I0309 14:16:10.553759 4764 scope.go:117] "RemoveContainer" containerID="c50fe2a841af25f201b2df55f17fb4354fb702de4d79adbf221b5ea06463110b" Mar 09 14:16:10 crc kubenswrapper[4764]: I0309 14:16:10.609802 4764 scope.go:117] "RemoveContainer" containerID="ecd62f1b0eb5226209d3ecf6887c5a0f2d0feaa519397c875dd114ffc89d8b3b" Mar 09 14:17:58 crc kubenswrapper[4764]: I0309 14:17:58.369991 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:17:58 crc kubenswrapper[4764]: I0309 14:17:58.370737 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:18:00 crc 
kubenswrapper[4764]: I0309 14:18:00.161501 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551098-7clls"]
Mar 09 14:18:00 crc kubenswrapper[4764]: E0309 14:18:00.162614 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3" containerName="oc"
Mar 09 14:18:00 crc kubenswrapper[4764]: I0309 14:18:00.162631 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3" containerName="oc"
Mar 09 14:18:00 crc kubenswrapper[4764]: I0309 14:18:00.162899 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3" containerName="oc"
Mar 09 14:18:00 crc kubenswrapper[4764]: I0309 14:18:00.163851 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551098-7clls"
Mar 09 14:18:00 crc kubenswrapper[4764]: I0309 14:18:00.166944 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr"
Mar 09 14:18:00 crc kubenswrapper[4764]: I0309 14:18:00.167027 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 14:18:00 crc kubenswrapper[4764]: I0309 14:18:00.167378 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 14:18:00 crc kubenswrapper[4764]: I0309 14:18:00.189907 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551098-7clls"]
Mar 09 14:18:00 crc kubenswrapper[4764]: I0309 14:18:00.308008 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5sck\" (UniqueName: \"kubernetes.io/projected/e86714ea-a59a-4955-b4a5-038ce0ce7bf6-kube-api-access-v5sck\") pod \"auto-csr-approver-29551098-7clls\" (UID: \"e86714ea-a59a-4955-b4a5-038ce0ce7bf6\") " pod="openshift-infra/auto-csr-approver-29551098-7clls"
Mar 09 14:18:00 crc kubenswrapper[4764]: I0309 14:18:00.411438 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5sck\" (UniqueName: \"kubernetes.io/projected/e86714ea-a59a-4955-b4a5-038ce0ce7bf6-kube-api-access-v5sck\") pod \"auto-csr-approver-29551098-7clls\" (UID: \"e86714ea-a59a-4955-b4a5-038ce0ce7bf6\") " pod="openshift-infra/auto-csr-approver-29551098-7clls"
Mar 09 14:18:00 crc kubenswrapper[4764]: I0309 14:18:00.451805 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5sck\" (UniqueName: \"kubernetes.io/projected/e86714ea-a59a-4955-b4a5-038ce0ce7bf6-kube-api-access-v5sck\") pod \"auto-csr-approver-29551098-7clls\" (UID: \"e86714ea-a59a-4955-b4a5-038ce0ce7bf6\") " pod="openshift-infra/auto-csr-approver-29551098-7clls"
Mar 09 14:18:00 crc kubenswrapper[4764]: I0309 14:18:00.490684 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551098-7clls"
Mar 09 14:18:01 crc kubenswrapper[4764]: I0309 14:18:01.009981 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551098-7clls"]
Mar 09 14:18:01 crc kubenswrapper[4764]: I0309 14:18:01.041241 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 14:18:01 crc kubenswrapper[4764]: I0309 14:18:01.104125 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551098-7clls" event={"ID":"e86714ea-a59a-4955-b4a5-038ce0ce7bf6","Type":"ContainerStarted","Data":"a22ac519a9a0cc5b0000c00fd0bf97e4d6fdb2c0405f4eabe925989232e16a69"}
Mar 09 14:18:03 crc kubenswrapper[4764]: I0309 14:18:03.125905 4764 generic.go:334] "Generic (PLEG): container finished" podID="e86714ea-a59a-4955-b4a5-038ce0ce7bf6" containerID="c9de23699d45475689268b26fadafce561fa4c6da1922b49b79542286bd5c95d" exitCode=0
Mar 09 14:18:03 crc kubenswrapper[4764]: I0309 14:18:03.126515 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551098-7clls" event={"ID":"e86714ea-a59a-4955-b4a5-038ce0ce7bf6","Type":"ContainerDied","Data":"c9de23699d45475689268b26fadafce561fa4c6da1922b49b79542286bd5c95d"}
Mar 09 14:18:04 crc kubenswrapper[4764]: I0309 14:18:04.640579 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551098-7clls"
Mar 09 14:18:04 crc kubenswrapper[4764]: I0309 14:18:04.837195 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5sck\" (UniqueName: \"kubernetes.io/projected/e86714ea-a59a-4955-b4a5-038ce0ce7bf6-kube-api-access-v5sck\") pod \"e86714ea-a59a-4955-b4a5-038ce0ce7bf6\" (UID: \"e86714ea-a59a-4955-b4a5-038ce0ce7bf6\") "
Mar 09 14:18:04 crc kubenswrapper[4764]: I0309 14:18:04.843736 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e86714ea-a59a-4955-b4a5-038ce0ce7bf6-kube-api-access-v5sck" (OuterVolumeSpecName: "kube-api-access-v5sck") pod "e86714ea-a59a-4955-b4a5-038ce0ce7bf6" (UID: "e86714ea-a59a-4955-b4a5-038ce0ce7bf6"). InnerVolumeSpecName "kube-api-access-v5sck". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:18:04 crc kubenswrapper[4764]: I0309 14:18:04.940118 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5sck\" (UniqueName: \"kubernetes.io/projected/e86714ea-a59a-4955-b4a5-038ce0ce7bf6-kube-api-access-v5sck\") on node \"crc\" DevicePath \"\""
Mar 09 14:18:05 crc kubenswrapper[4764]: I0309 14:18:05.148600 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551098-7clls" event={"ID":"e86714ea-a59a-4955-b4a5-038ce0ce7bf6","Type":"ContainerDied","Data":"a22ac519a9a0cc5b0000c00fd0bf97e4d6fdb2c0405f4eabe925989232e16a69"}
Mar 09 14:18:05 crc kubenswrapper[4764]: I0309 14:18:05.149827 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a22ac519a9a0cc5b0000c00fd0bf97e4d6fdb2c0405f4eabe925989232e16a69"
Mar 09 14:18:05 crc kubenswrapper[4764]: I0309 14:18:05.149967 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551098-7clls"
Mar 09 14:18:05 crc kubenswrapper[4764]: I0309 14:18:05.748726 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551092-qs44j"]
Mar 09 14:18:05 crc kubenswrapper[4764]: I0309 14:18:05.754455 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551092-qs44j"]
Mar 09 14:18:07 crc kubenswrapper[4764]: I0309 14:18:07.572009 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81a7f588-07b1-4ef1-97ee-420e944ad16b" path="/var/lib/kubelet/pods/81a7f588-07b1-4ef1-97ee-420e944ad16b/volumes"
Mar 09 14:18:10 crc kubenswrapper[4764]: I0309 14:18:10.753930 4764 scope.go:117] "RemoveContainer" containerID="4148e832dbd4302f41c6dbad3eab96d625f5a7299088cd33baf2acb47b5d3267"
Mar 09 14:18:28 crc kubenswrapper[4764]: I0309 14:18:28.370356 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 14:18:28 crc kubenswrapper[4764]: I0309 14:18:28.373037 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 14:18:32 crc kubenswrapper[4764]: I0309 14:18:32.041518 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qrmxw"]
Mar 09 14:18:32 crc kubenswrapper[4764]: E0309 14:18:32.043917 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e86714ea-a59a-4955-b4a5-038ce0ce7bf6" containerName="oc"
Mar 09 14:18:32 crc kubenswrapper[4764]: I0309 14:18:32.043942 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86714ea-a59a-4955-b4a5-038ce0ce7bf6" containerName="oc"
Mar 09 14:18:32 crc kubenswrapper[4764]: I0309 14:18:32.044346 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e86714ea-a59a-4955-b4a5-038ce0ce7bf6" containerName="oc"
Mar 09 14:18:32 crc kubenswrapper[4764]: I0309 14:18:32.046935 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qrmxw"
Mar 09 14:18:32 crc kubenswrapper[4764]: I0309 14:18:32.059804 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qrmxw"]
Mar 09 14:18:32 crc kubenswrapper[4764]: I0309 14:18:32.172818 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e756ebeb-906e-4f9e-8abc-d254ffed03b7-utilities\") pod \"community-operators-qrmxw\" (UID: \"e756ebeb-906e-4f9e-8abc-d254ffed03b7\") " pod="openshift-marketplace/community-operators-qrmxw"
Mar 09 14:18:32 crc kubenswrapper[4764]: I0309 14:18:32.172896 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e756ebeb-906e-4f9e-8abc-d254ffed03b7-catalog-content\") pod \"community-operators-qrmxw\" (UID: \"e756ebeb-906e-4f9e-8abc-d254ffed03b7\") " pod="openshift-marketplace/community-operators-qrmxw"
Mar 09 14:18:32 crc kubenswrapper[4764]: I0309 14:18:32.172934 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9zcl\" (UniqueName: \"kubernetes.io/projected/e756ebeb-906e-4f9e-8abc-d254ffed03b7-kube-api-access-d9zcl\") pod \"community-operators-qrmxw\" (UID: \"e756ebeb-906e-4f9e-8abc-d254ffed03b7\") " pod="openshift-marketplace/community-operators-qrmxw"
Mar 09 14:18:32 crc kubenswrapper[4764]: I0309 14:18:32.275859 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e756ebeb-906e-4f9e-8abc-d254ffed03b7-utilities\") pod \"community-operators-qrmxw\" (UID: \"e756ebeb-906e-4f9e-8abc-d254ffed03b7\") " pod="openshift-marketplace/community-operators-qrmxw"
Mar 09 14:18:32 crc kubenswrapper[4764]: I0309 14:18:32.275958 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e756ebeb-906e-4f9e-8abc-d254ffed03b7-catalog-content\") pod \"community-operators-qrmxw\" (UID: \"e756ebeb-906e-4f9e-8abc-d254ffed03b7\") " pod="openshift-marketplace/community-operators-qrmxw"
Mar 09 14:18:32 crc kubenswrapper[4764]: I0309 14:18:32.276021 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9zcl\" (UniqueName: \"kubernetes.io/projected/e756ebeb-906e-4f9e-8abc-d254ffed03b7-kube-api-access-d9zcl\") pod \"community-operators-qrmxw\" (UID: \"e756ebeb-906e-4f9e-8abc-d254ffed03b7\") " pod="openshift-marketplace/community-operators-qrmxw"
Mar 09 14:18:32 crc kubenswrapper[4764]: I0309 14:18:32.276749 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e756ebeb-906e-4f9e-8abc-d254ffed03b7-utilities\") pod \"community-operators-qrmxw\" (UID: \"e756ebeb-906e-4f9e-8abc-d254ffed03b7\") " pod="openshift-marketplace/community-operators-qrmxw"
Mar 09 14:18:32 crc kubenswrapper[4764]: I0309 14:18:32.277065 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e756ebeb-906e-4f9e-8abc-d254ffed03b7-catalog-content\") pod \"community-operators-qrmxw\" (UID: \"e756ebeb-906e-4f9e-8abc-d254ffed03b7\") " pod="openshift-marketplace/community-operators-qrmxw"
Mar 09 14:18:32 crc kubenswrapper[4764]: I0309 14:18:32.297290 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9zcl\" (UniqueName: \"kubernetes.io/projected/e756ebeb-906e-4f9e-8abc-d254ffed03b7-kube-api-access-d9zcl\") pod \"community-operators-qrmxw\" (UID: \"e756ebeb-906e-4f9e-8abc-d254ffed03b7\") " pod="openshift-marketplace/community-operators-qrmxw"
Mar 09 14:18:32 crc kubenswrapper[4764]: I0309 14:18:32.377404 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qrmxw"
Mar 09 14:18:33 crc kubenswrapper[4764]: I0309 14:18:33.069372 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qrmxw"]
Mar 09 14:18:33 crc kubenswrapper[4764]: I0309 14:18:33.420905 4764 generic.go:334] "Generic (PLEG): container finished" podID="e756ebeb-906e-4f9e-8abc-d254ffed03b7" containerID="b2cca4f97913cf737df9850825be5c4968dfd8f52401c34f636f9062d699d84c" exitCode=0
Mar 09 14:18:33 crc kubenswrapper[4764]: I0309 14:18:33.421027 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qrmxw" event={"ID":"e756ebeb-906e-4f9e-8abc-d254ffed03b7","Type":"ContainerDied","Data":"b2cca4f97913cf737df9850825be5c4968dfd8f52401c34f636f9062d699d84c"}
Mar 09 14:18:33 crc kubenswrapper[4764]: I0309 14:18:33.421303 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qrmxw" event={"ID":"e756ebeb-906e-4f9e-8abc-d254ffed03b7","Type":"ContainerStarted","Data":"262fcd6db8b7bee48513ef2e76ec7c52b7eea78484a12e2ae46265c79cd504ba"}
Mar 09 14:18:34 crc kubenswrapper[4764]: I0309 14:18:34.435868 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qrmxw" event={"ID":"e756ebeb-906e-4f9e-8abc-d254ffed03b7","Type":"ContainerStarted","Data":"ca4f23dac650d5b34aa704ef04fde382995e9921c39239b2af5cbeb3d6f9c134"}
Mar 09 14:18:35 crc kubenswrapper[4764]: I0309 14:18:35.448208 4764 generic.go:334] "Generic (PLEG): container finished" podID="e756ebeb-906e-4f9e-8abc-d254ffed03b7" containerID="ca4f23dac650d5b34aa704ef04fde382995e9921c39239b2af5cbeb3d6f9c134" exitCode=0
Mar 09 14:18:35 crc kubenswrapper[4764]: I0309 14:18:35.448423 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qrmxw" event={"ID":"e756ebeb-906e-4f9e-8abc-d254ffed03b7","Type":"ContainerDied","Data":"ca4f23dac650d5b34aa704ef04fde382995e9921c39239b2af5cbeb3d6f9c134"}
Mar 09 14:18:36 crc kubenswrapper[4764]: I0309 14:18:36.458968 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qrmxw" event={"ID":"e756ebeb-906e-4f9e-8abc-d254ffed03b7","Type":"ContainerStarted","Data":"4887c69c88d174633ecaac7efec228f692e197680549737c11953b3bdc179e85"}
Mar 09 14:18:42 crc kubenswrapper[4764]: I0309 14:18:42.378633 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qrmxw"
Mar 09 14:18:42 crc kubenswrapper[4764]: I0309 14:18:42.379699 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qrmxw"
Mar 09 14:18:42 crc kubenswrapper[4764]: I0309 14:18:42.447443 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qrmxw"
Mar 09 14:18:42 crc kubenswrapper[4764]: I0309 14:18:42.472451 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qrmxw" podStartSLOduration=8.057431336 podStartE2EDuration="10.472413701s" podCreationTimestamp="2026-03-09 14:18:32 +0000 UTC" firstStartedPulling="2026-03-09 14:18:33.422914774 +0000 UTC m=+3468.673086672" lastFinishedPulling="2026-03-09 14:18:35.837897129 +0000 UTC m=+3471.088069037" observedRunningTime="2026-03-09 14:18:36.495860996 +0000 UTC m=+3471.746032904" watchObservedRunningTime="2026-03-09 14:18:42.472413701 +0000 UTC m=+3477.722585619"
Mar 09 14:18:42 crc kubenswrapper[4764]: I0309 14:18:42.583111 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qrmxw"
Mar 09 14:18:42 crc kubenswrapper[4764]: I0309 14:18:42.686945 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qrmxw"]
Mar 09 14:18:44 crc kubenswrapper[4764]: I0309 14:18:44.550133 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qrmxw" podUID="e756ebeb-906e-4f9e-8abc-d254ffed03b7" containerName="registry-server" containerID="cri-o://4887c69c88d174633ecaac7efec228f692e197680549737c11953b3bdc179e85" gracePeriod=2
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.265294 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qrmxw"
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.316763 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9zcl\" (UniqueName: \"kubernetes.io/projected/e756ebeb-906e-4f9e-8abc-d254ffed03b7-kube-api-access-d9zcl\") pod \"e756ebeb-906e-4f9e-8abc-d254ffed03b7\" (UID: \"e756ebeb-906e-4f9e-8abc-d254ffed03b7\") "
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.317133 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e756ebeb-906e-4f9e-8abc-d254ffed03b7-catalog-content\") pod \"e756ebeb-906e-4f9e-8abc-d254ffed03b7\" (UID: \"e756ebeb-906e-4f9e-8abc-d254ffed03b7\") "
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.317221 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e756ebeb-906e-4f9e-8abc-d254ffed03b7-utilities\") pod \"e756ebeb-906e-4f9e-8abc-d254ffed03b7\" (UID: \"e756ebeb-906e-4f9e-8abc-d254ffed03b7\") "
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.319824 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e756ebeb-906e-4f9e-8abc-d254ffed03b7-utilities" (OuterVolumeSpecName: "utilities") pod "e756ebeb-906e-4f9e-8abc-d254ffed03b7" (UID: "e756ebeb-906e-4f9e-8abc-d254ffed03b7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.324486 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e756ebeb-906e-4f9e-8abc-d254ffed03b7-kube-api-access-d9zcl" (OuterVolumeSpecName: "kube-api-access-d9zcl") pod "e756ebeb-906e-4f9e-8abc-d254ffed03b7" (UID: "e756ebeb-906e-4f9e-8abc-d254ffed03b7"). InnerVolumeSpecName "kube-api-access-d9zcl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.371889 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e756ebeb-906e-4f9e-8abc-d254ffed03b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e756ebeb-906e-4f9e-8abc-d254ffed03b7" (UID: "e756ebeb-906e-4f9e-8abc-d254ffed03b7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.420803 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9zcl\" (UniqueName: \"kubernetes.io/projected/e756ebeb-906e-4f9e-8abc-d254ffed03b7-kube-api-access-d9zcl\") on node \"crc\" DevicePath \"\""
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.420851 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e756ebeb-906e-4f9e-8abc-d254ffed03b7-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.420863 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e756ebeb-906e-4f9e-8abc-d254ffed03b7-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.582255 4764 generic.go:334] "Generic (PLEG): container finished" podID="e756ebeb-906e-4f9e-8abc-d254ffed03b7" containerID="4887c69c88d174633ecaac7efec228f692e197680549737c11953b3bdc179e85" exitCode=0
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.582365 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qrmxw"
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.583962 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qrmxw" event={"ID":"e756ebeb-906e-4f9e-8abc-d254ffed03b7","Type":"ContainerDied","Data":"4887c69c88d174633ecaac7efec228f692e197680549737c11953b3bdc179e85"}
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.584008 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qrmxw" event={"ID":"e756ebeb-906e-4f9e-8abc-d254ffed03b7","Type":"ContainerDied","Data":"262fcd6db8b7bee48513ef2e76ec7c52b7eea78484a12e2ae46265c79cd504ba"}
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.584696 4764 scope.go:117] "RemoveContainer" containerID="4887c69c88d174633ecaac7efec228f692e197680549737c11953b3bdc179e85"
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.639430 4764 scope.go:117] "RemoveContainer" containerID="ca4f23dac650d5b34aa704ef04fde382995e9921c39239b2af5cbeb3d6f9c134"
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.655092 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qrmxw"]
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.667231 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qrmxw"]
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.668761 4764 scope.go:117] "RemoveContainer" containerID="b2cca4f97913cf737df9850825be5c4968dfd8f52401c34f636f9062d699d84c"
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.715890 4764 scope.go:117] "RemoveContainer" containerID="4887c69c88d174633ecaac7efec228f692e197680549737c11953b3bdc179e85"
Mar 09 14:18:45 crc kubenswrapper[4764]: E0309 14:18:45.716827 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4887c69c88d174633ecaac7efec228f692e197680549737c11953b3bdc179e85\": container with ID starting with 4887c69c88d174633ecaac7efec228f692e197680549737c11953b3bdc179e85 not found: ID does not exist" containerID="4887c69c88d174633ecaac7efec228f692e197680549737c11953b3bdc179e85"
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.716870 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4887c69c88d174633ecaac7efec228f692e197680549737c11953b3bdc179e85"} err="failed to get container status \"4887c69c88d174633ecaac7efec228f692e197680549737c11953b3bdc179e85\": rpc error: code = NotFound desc = could not find container \"4887c69c88d174633ecaac7efec228f692e197680549737c11953b3bdc179e85\": container with ID starting with 4887c69c88d174633ecaac7efec228f692e197680549737c11953b3bdc179e85 not found: ID does not exist"
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.716898 4764 scope.go:117] "RemoveContainer" containerID="ca4f23dac650d5b34aa704ef04fde382995e9921c39239b2af5cbeb3d6f9c134"
Mar 09 14:18:45 crc kubenswrapper[4764]: E0309 14:18:45.717222 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca4f23dac650d5b34aa704ef04fde382995e9921c39239b2af5cbeb3d6f9c134\": container with ID starting with ca4f23dac650d5b34aa704ef04fde382995e9921c39239b2af5cbeb3d6f9c134 not found: ID does not exist" containerID="ca4f23dac650d5b34aa704ef04fde382995e9921c39239b2af5cbeb3d6f9c134"
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.717303 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca4f23dac650d5b34aa704ef04fde382995e9921c39239b2af5cbeb3d6f9c134"} err="failed to get container status \"ca4f23dac650d5b34aa704ef04fde382995e9921c39239b2af5cbeb3d6f9c134\": rpc error: code = NotFound desc = could not find container \"ca4f23dac650d5b34aa704ef04fde382995e9921c39239b2af5cbeb3d6f9c134\": container with ID starting with ca4f23dac650d5b34aa704ef04fde382995e9921c39239b2af5cbeb3d6f9c134 not found: ID does not exist"
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.717372 4764 scope.go:117] "RemoveContainer" containerID="b2cca4f97913cf737df9850825be5c4968dfd8f52401c34f636f9062d699d84c"
Mar 09 14:18:45 crc kubenswrapper[4764]: E0309 14:18:45.718014 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2cca4f97913cf737df9850825be5c4968dfd8f52401c34f636f9062d699d84c\": container with ID starting with b2cca4f97913cf737df9850825be5c4968dfd8f52401c34f636f9062d699d84c not found: ID does not exist" containerID="b2cca4f97913cf737df9850825be5c4968dfd8f52401c34f636f9062d699d84c"
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.718042 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2cca4f97913cf737df9850825be5c4968dfd8f52401c34f636f9062d699d84c"} err="failed to get container status \"b2cca4f97913cf737df9850825be5c4968dfd8f52401c34f636f9062d699d84c\": rpc error: code = NotFound desc = could not find container \"b2cca4f97913cf737df9850825be5c4968dfd8f52401c34f636f9062d699d84c\": container with ID starting with b2cca4f97913cf737df9850825be5c4968dfd8f52401c34f636f9062d699d84c not found: ID does not exist"
Mar 09 14:18:47 crc kubenswrapper[4764]: I0309 14:18:47.573172 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e756ebeb-906e-4f9e-8abc-d254ffed03b7" path="/var/lib/kubelet/pods/e756ebeb-906e-4f9e-8abc-d254ffed03b7/volumes"
Mar 09 14:18:58 crc kubenswrapper[4764]: I0309 14:18:58.370834 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 14:18:58 crc kubenswrapper[4764]: I0309 14:18:58.371473 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 14:18:58 crc kubenswrapper[4764]: I0309 14:18:58.371533 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xxczl"
Mar 09 14:18:58 crc kubenswrapper[4764]: I0309 14:18:58.372513 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e0fd7c06957b904af3795728b54332eb3bc1c0b6a8df56e1a4bc3599bdedd8b5"} pod="openshift-machine-config-operator/machine-config-daemon-xxczl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 14:18:58 crc kubenswrapper[4764]: I0309 14:18:58.372572 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" containerID="cri-o://e0fd7c06957b904af3795728b54332eb3bc1c0b6a8df56e1a4bc3599bdedd8b5" gracePeriod=600
Mar 09 14:18:58 crc kubenswrapper[4764]: I0309 14:18:58.731137 4764 generic.go:334] "Generic (PLEG): container finished" podID="6bcdd179-43c2-427c-9fac-7155c122e922" containerID="e0fd7c06957b904af3795728b54332eb3bc1c0b6a8df56e1a4bc3599bdedd8b5" exitCode=0
Mar 09 14:18:58 crc kubenswrapper[4764]: I0309 14:18:58.731859 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerDied","Data":"e0fd7c06957b904af3795728b54332eb3bc1c0b6a8df56e1a4bc3599bdedd8b5"}
Mar 09 14:18:58 crc kubenswrapper[4764]: I0309 14:18:58.731907 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb"
Mar 09 14:18:59 crc kubenswrapper[4764]: I0309 14:18:59.744554 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096"}
Mar 09 14:19:48 crc kubenswrapper[4764]: I0309 14:19:48.046253 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-fdg68"]
Mar 09 14:19:48 crc kubenswrapper[4764]: I0309 14:19:48.055358 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-fdg68"]
Mar 09 14:19:49 crc kubenswrapper[4764]: I0309 14:19:49.034612 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-1df0-account-create-update-bg564"]
Mar 09 14:19:49 crc kubenswrapper[4764]: I0309 14:19:49.049523 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-1df0-account-create-update-bg564"]
Mar 09 14:19:49 crc kubenswrapper[4764]: I0309 14:19:49.572114 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e1948a5-46f6-412d-91c3-bf9c255e02fc" path="/var/lib/kubelet/pods/5e1948a5-46f6-412d-91c3-bf9c255e02fc/volumes"
Mar 09 14:19:49 crc kubenswrapper[4764]: I0309 14:19:49.573879 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ce9bce5-9c23-40ac-9683-6fb232e32c3c" path="/var/lib/kubelet/pods/7ce9bce5-9c23-40ac-9683-6fb232e32c3c/volumes"
Mar 09 14:20:00 crc kubenswrapper[4764]: I0309 14:20:00.170891 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551100-l6b4v"]
Mar 09 14:20:00 crc kubenswrapper[4764]: E0309 14:20:00.172303 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e756ebeb-906e-4f9e-8abc-d254ffed03b7" containerName="extract-content"
Mar 09 14:20:00 crc kubenswrapper[4764]: I0309 14:20:00.172325 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e756ebeb-906e-4f9e-8abc-d254ffed03b7" containerName="extract-content"
Mar 09 14:20:00 crc kubenswrapper[4764]: E0309 14:20:00.172343 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e756ebeb-906e-4f9e-8abc-d254ffed03b7" containerName="extract-utilities"
Mar 09 14:20:00 crc kubenswrapper[4764]: I0309 14:20:00.172350 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e756ebeb-906e-4f9e-8abc-d254ffed03b7" containerName="extract-utilities"
Mar 09 14:20:00 crc kubenswrapper[4764]: E0309 14:20:00.172369 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e756ebeb-906e-4f9e-8abc-d254ffed03b7" containerName="registry-server"
Mar 09 14:20:00 crc kubenswrapper[4764]: I0309 14:20:00.172376 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e756ebeb-906e-4f9e-8abc-d254ffed03b7" containerName="registry-server"
Mar 09 14:20:00 crc kubenswrapper[4764]: I0309 14:20:00.172721 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e756ebeb-906e-4f9e-8abc-d254ffed03b7" containerName="registry-server"
Mar 09 14:20:00 crc kubenswrapper[4764]: I0309 14:20:00.173811 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551100-l6b4v"
Mar 09 14:20:00 crc kubenswrapper[4764]: I0309 14:20:00.177133 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr"
Mar 09 14:20:00 crc kubenswrapper[4764]: I0309 14:20:00.177509 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 14:20:00 crc kubenswrapper[4764]: I0309 14:20:00.178699 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 14:20:00 crc kubenswrapper[4764]: I0309 14:20:00.202262 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551100-l6b4v"]
Mar 09 14:20:00 crc kubenswrapper[4764]: I0309 14:20:00.272670 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl5lx\" (UniqueName: \"kubernetes.io/projected/ac3d8a45-0030-433c-a813-fa93811b952f-kube-api-access-xl5lx\") pod \"auto-csr-approver-29551100-l6b4v\" (UID: \"ac3d8a45-0030-433c-a813-fa93811b952f\") " pod="openshift-infra/auto-csr-approver-29551100-l6b4v"
Mar 09 14:20:00 crc kubenswrapper[4764]: I0309 14:20:00.375347 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl5lx\" (UniqueName: \"kubernetes.io/projected/ac3d8a45-0030-433c-a813-fa93811b952f-kube-api-access-xl5lx\") pod \"auto-csr-approver-29551100-l6b4v\" (UID: \"ac3d8a45-0030-433c-a813-fa93811b952f\") " pod="openshift-infra/auto-csr-approver-29551100-l6b4v"
Mar 09 14:20:00 crc kubenswrapper[4764]: I0309 14:20:00.396352 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl5lx\" (UniqueName: \"kubernetes.io/projected/ac3d8a45-0030-433c-a813-fa93811b952f-kube-api-access-xl5lx\") pod \"auto-csr-approver-29551100-l6b4v\" (UID: \"ac3d8a45-0030-433c-a813-fa93811b952f\") " pod="openshift-infra/auto-csr-approver-29551100-l6b4v"
Mar 09 14:20:00 crc kubenswrapper[4764]: I0309 14:20:00.498608 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551100-l6b4v"
Mar 09 14:20:00 crc kubenswrapper[4764]: I0309 14:20:00.986562 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551100-l6b4v"]
Mar 09 14:20:01 crc kubenswrapper[4764]: I0309 14:20:01.389080 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551100-l6b4v" event={"ID":"ac3d8a45-0030-433c-a813-fa93811b952f","Type":"ContainerStarted","Data":"2f39d774e1aa1eb9ee3e30642b6b31806b792b18668b4e93d0df0af150d71706"}
Mar 09 14:20:03 crc kubenswrapper[4764]: I0309 14:20:03.416113 4764 generic.go:334] "Generic (PLEG): container finished" podID="ac3d8a45-0030-433c-a813-fa93811b952f" containerID="0555c912480a032e6e1167209d44998b6306e07cd673abce316913bd071bd80f" exitCode=0
Mar 09 14:20:03 crc kubenswrapper[4764]: I0309 14:20:03.416236 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551100-l6b4v" event={"ID":"ac3d8a45-0030-433c-a813-fa93811b952f","Type":"ContainerDied","Data":"0555c912480a032e6e1167209d44998b6306e07cd673abce316913bd071bd80f"}
Mar 09 14:20:04 crc kubenswrapper[4764]: I0309 14:20:04.818711 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551100-l6b4v"
Mar 09 14:20:04 crc kubenswrapper[4764]: I0309 14:20:04.887316 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl5lx\" (UniqueName: \"kubernetes.io/projected/ac3d8a45-0030-433c-a813-fa93811b952f-kube-api-access-xl5lx\") pod \"ac3d8a45-0030-433c-a813-fa93811b952f\" (UID: \"ac3d8a45-0030-433c-a813-fa93811b952f\") "
Mar 09 14:20:04 crc kubenswrapper[4764]: I0309 14:20:04.895720 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac3d8a45-0030-433c-a813-fa93811b952f-kube-api-access-xl5lx" (OuterVolumeSpecName: "kube-api-access-xl5lx") pod "ac3d8a45-0030-433c-a813-fa93811b952f" (UID: "ac3d8a45-0030-433c-a813-fa93811b952f"). InnerVolumeSpecName "kube-api-access-xl5lx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:20:04 crc kubenswrapper[4764]: I0309 14:20:04.990902 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xl5lx\" (UniqueName: \"kubernetes.io/projected/ac3d8a45-0030-433c-a813-fa93811b952f-kube-api-access-xl5lx\") on node \"crc\" DevicePath \"\""
Mar 09 14:20:05 crc kubenswrapper[4764]: I0309 14:20:05.438496 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551100-l6b4v" event={"ID":"ac3d8a45-0030-433c-a813-fa93811b952f","Type":"ContainerDied","Data":"2f39d774e1aa1eb9ee3e30642b6b31806b792b18668b4e93d0df0af150d71706"}
Mar 09 14:20:05 crc kubenswrapper[4764]: I0309 14:20:05.438549 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f39d774e1aa1eb9ee3e30642b6b31806b792b18668b4e93d0df0af150d71706"
Mar 09 14:20:05 crc kubenswrapper[4764]: I0309 14:20:05.438708 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551100-l6b4v"
Mar 09 14:20:05 crc kubenswrapper[4764]: I0309 14:20:05.895832 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551094-w5qt4"]
Mar 09 14:20:05 crc kubenswrapper[4764]: I0309 14:20:05.905863 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551094-w5qt4"]
Mar 09 14:20:07 crc kubenswrapper[4764]: I0309 14:20:07.574870 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7d72c38-b071-4fcb-89b4-935542a1943e" path="/var/lib/kubelet/pods/d7d72c38-b071-4fcb-89b4-935542a1943e/volumes"
Mar 09 14:20:10 crc kubenswrapper[4764]: I0309 14:20:10.903667 4764 scope.go:117] "RemoveContainer" containerID="25dd59f6d05d4fb8f17a37a81bd52fddb49fdd1d34f6ac322a9adb36613a0c6b"
Mar 09 14:20:10 crc kubenswrapper[4764]: I0309 14:20:10.935473 4764 scope.go:117] "RemoveContainer" containerID="1d30d3bafe785b61035cfd6776bbf1fbd30eae865400a81c90fd508159791920"
Mar 09 14:20:10 crc kubenswrapper[4764]: I0309 14:20:10.966924 4764 scope.go:117] "RemoveContainer" containerID="bdf07c58d276db116b39b558183c8a6af0bb01c84a705a0b57c15a4ac4f6b634"
Mar 09 14:20:11 crc kubenswrapper[4764]: I0309 14:20:11.029941 4764 scope.go:117] "RemoveContainer" containerID="a83e9cbe22286a479af2a92b6e16b2800ac926082aaf40b98e4defaf7a33610b"
Mar 09 14:20:11 crc kubenswrapper[4764]: I0309 14:20:11.072791 4764 scope.go:117] "RemoveContainer" containerID="865365e28885b4a1f3c0039c8f24992ea422c88be0ea3e7581c981c450049a6c"
Mar 09 14:20:11 crc kubenswrapper[4764]: I0309 14:20:11.160555 4764 scope.go:117] "RemoveContainer" containerID="89bd3940c282c4190fbd12e55d3b6572d76973664eb8e08171ebbd6ade38f50f"
Mar 09 14:20:23 crc kubenswrapper[4764]: I0309 14:20:23.049040 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-j6s45"]
Mar 09 14:20:23 crc kubenswrapper[4764]: I0309 14:20:23.062488 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-j6s45"]
Mar 09 14:20:23 crc kubenswrapper[4764]: I0309 14:20:23.571928 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6711cdff-410c-4d91-b172-c2065054c1be" path="/var/lib/kubelet/pods/6711cdff-410c-4d91-b172-c2065054c1be/volumes"
Mar 09 14:20:40 crc kubenswrapper[4764]: I0309 14:20:40.926489 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rbppf"]
Mar 09 14:20:40 crc kubenswrapper[4764]: E0309 14:20:40.929620 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac3d8a45-0030-433c-a813-fa93811b952f" containerName="oc"
Mar 09 14:20:40 crc kubenswrapper[4764]: I0309 14:20:40.929792 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac3d8a45-0030-433c-a813-fa93811b952f" containerName="oc"
Mar 09 14:20:40 crc kubenswrapper[4764]: I0309 14:20:40.930091 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac3d8a45-0030-433c-a813-fa93811b952f" containerName="oc"
Mar 09 14:20:40 crc kubenswrapper[4764]: I0309 14:20:40.932230 4764 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rbppf" Mar 09 14:20:40 crc kubenswrapper[4764]: I0309 14:20:40.953971 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rbppf"] Mar 09 14:20:41 crc kubenswrapper[4764]: I0309 14:20:41.071945 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/888fc58f-9a2b-4586-bc38-d645cae21425-utilities\") pod \"redhat-marketplace-rbppf\" (UID: \"888fc58f-9a2b-4586-bc38-d645cae21425\") " pod="openshift-marketplace/redhat-marketplace-rbppf" Mar 09 14:20:41 crc kubenswrapper[4764]: I0309 14:20:41.072150 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq8tf\" (UniqueName: \"kubernetes.io/projected/888fc58f-9a2b-4586-bc38-d645cae21425-kube-api-access-tq8tf\") pod \"redhat-marketplace-rbppf\" (UID: \"888fc58f-9a2b-4586-bc38-d645cae21425\") " pod="openshift-marketplace/redhat-marketplace-rbppf" Mar 09 14:20:41 crc kubenswrapper[4764]: I0309 14:20:41.072557 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/888fc58f-9a2b-4586-bc38-d645cae21425-catalog-content\") pod \"redhat-marketplace-rbppf\" (UID: \"888fc58f-9a2b-4586-bc38-d645cae21425\") " pod="openshift-marketplace/redhat-marketplace-rbppf" Mar 09 14:20:41 crc kubenswrapper[4764]: I0309 14:20:41.175599 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/888fc58f-9a2b-4586-bc38-d645cae21425-catalog-content\") pod \"redhat-marketplace-rbppf\" (UID: \"888fc58f-9a2b-4586-bc38-d645cae21425\") " pod="openshift-marketplace/redhat-marketplace-rbppf" Mar 09 14:20:41 crc kubenswrapper[4764]: I0309 14:20:41.175848 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/888fc58f-9a2b-4586-bc38-d645cae21425-utilities\") pod \"redhat-marketplace-rbppf\" (UID: \"888fc58f-9a2b-4586-bc38-d645cae21425\") " pod="openshift-marketplace/redhat-marketplace-rbppf" Mar 09 14:20:41 crc kubenswrapper[4764]: I0309 14:20:41.175902 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq8tf\" (UniqueName: \"kubernetes.io/projected/888fc58f-9a2b-4586-bc38-d645cae21425-kube-api-access-tq8tf\") pod \"redhat-marketplace-rbppf\" (UID: \"888fc58f-9a2b-4586-bc38-d645cae21425\") " pod="openshift-marketplace/redhat-marketplace-rbppf" Mar 09 14:20:41 crc kubenswrapper[4764]: I0309 14:20:41.176352 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/888fc58f-9a2b-4586-bc38-d645cae21425-catalog-content\") pod \"redhat-marketplace-rbppf\" (UID: \"888fc58f-9a2b-4586-bc38-d645cae21425\") " pod="openshift-marketplace/redhat-marketplace-rbppf" Mar 09 14:20:41 crc kubenswrapper[4764]: I0309 14:20:41.176467 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/888fc58f-9a2b-4586-bc38-d645cae21425-utilities\") pod \"redhat-marketplace-rbppf\" (UID: \"888fc58f-9a2b-4586-bc38-d645cae21425\") " pod="openshift-marketplace/redhat-marketplace-rbppf" Mar 09 14:20:41 crc kubenswrapper[4764]: I0309 14:20:41.201592 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq8tf\" (UniqueName: \"kubernetes.io/projected/888fc58f-9a2b-4586-bc38-d645cae21425-kube-api-access-tq8tf\") pod \"redhat-marketplace-rbppf\" (UID: \"888fc58f-9a2b-4586-bc38-d645cae21425\") " pod="openshift-marketplace/redhat-marketplace-rbppf" Mar 09 14:20:41 crc kubenswrapper[4764]: I0309 14:20:41.255325 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rbppf" Mar 09 14:20:41 crc kubenswrapper[4764]: I0309 14:20:41.847667 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rbppf"] Mar 09 14:20:42 crc kubenswrapper[4764]: I0309 14:20:42.858161 4764 generic.go:334] "Generic (PLEG): container finished" podID="888fc58f-9a2b-4586-bc38-d645cae21425" containerID="b626cc702ddc013282f6ebfcbde013693e80a5808ff3a9fa5b4069a2ac0da90d" exitCode=0 Mar 09 14:20:42 crc kubenswrapper[4764]: I0309 14:20:42.858239 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbppf" event={"ID":"888fc58f-9a2b-4586-bc38-d645cae21425","Type":"ContainerDied","Data":"b626cc702ddc013282f6ebfcbde013693e80a5808ff3a9fa5b4069a2ac0da90d"} Mar 09 14:20:42 crc kubenswrapper[4764]: I0309 14:20:42.858887 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbppf" event={"ID":"888fc58f-9a2b-4586-bc38-d645cae21425","Type":"ContainerStarted","Data":"54048bcbd0e67426acff49fbd46b2a61d8208845b90d713b788118d239a1246a"} Mar 09 14:20:43 crc kubenswrapper[4764]: I0309 14:20:43.879311 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbppf" event={"ID":"888fc58f-9a2b-4586-bc38-d645cae21425","Type":"ContainerStarted","Data":"fb3683c2f5b994d9bf1b4476b7c14f9500475e2cfc2431fdcb9a1a8f8933c048"} Mar 09 14:20:44 crc kubenswrapper[4764]: I0309 14:20:44.892421 4764 generic.go:334] "Generic (PLEG): container finished" podID="888fc58f-9a2b-4586-bc38-d645cae21425" containerID="fb3683c2f5b994d9bf1b4476b7c14f9500475e2cfc2431fdcb9a1a8f8933c048" exitCode=0 Mar 09 14:20:44 crc kubenswrapper[4764]: I0309 14:20:44.892473 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbppf" 
event={"ID":"888fc58f-9a2b-4586-bc38-d645cae21425","Type":"ContainerDied","Data":"fb3683c2f5b994d9bf1b4476b7c14f9500475e2cfc2431fdcb9a1a8f8933c048"} Mar 09 14:20:45 crc kubenswrapper[4764]: I0309 14:20:45.911341 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbppf" event={"ID":"888fc58f-9a2b-4586-bc38-d645cae21425","Type":"ContainerStarted","Data":"dee20c84fe6a88a99746d593b2b47a68cf15d7855d8604d9f74434aafdb9bf89"} Mar 09 14:20:45 crc kubenswrapper[4764]: I0309 14:20:45.946008 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rbppf" podStartSLOduration=3.49453088 podStartE2EDuration="5.945972442s" podCreationTimestamp="2026-03-09 14:20:40 +0000 UTC" firstStartedPulling="2026-03-09 14:20:42.861100001 +0000 UTC m=+3598.111271909" lastFinishedPulling="2026-03-09 14:20:45.312541563 +0000 UTC m=+3600.562713471" observedRunningTime="2026-03-09 14:20:45.932989764 +0000 UTC m=+3601.183161672" watchObservedRunningTime="2026-03-09 14:20:45.945972442 +0000 UTC m=+3601.196144350" Mar 09 14:20:51 crc kubenswrapper[4764]: I0309 14:20:51.255713 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rbppf" Mar 09 14:20:51 crc kubenswrapper[4764]: I0309 14:20:51.256612 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rbppf" Mar 09 14:20:51 crc kubenswrapper[4764]: I0309 14:20:51.310823 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rbppf" Mar 09 14:20:52 crc kubenswrapper[4764]: I0309 14:20:52.017442 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rbppf" Mar 09 14:20:52 crc kubenswrapper[4764]: I0309 14:20:52.082809 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-rbppf"] Mar 09 14:20:53 crc kubenswrapper[4764]: I0309 14:20:53.986429 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rbppf" podUID="888fc58f-9a2b-4586-bc38-d645cae21425" containerName="registry-server" containerID="cri-o://dee20c84fe6a88a99746d593b2b47a68cf15d7855d8604d9f74434aafdb9bf89" gracePeriod=2 Mar 09 14:20:54 crc kubenswrapper[4764]: I0309 14:20:54.528198 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rbppf" Mar 09 14:20:54 crc kubenswrapper[4764]: I0309 14:20:54.725046 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tq8tf\" (UniqueName: \"kubernetes.io/projected/888fc58f-9a2b-4586-bc38-d645cae21425-kube-api-access-tq8tf\") pod \"888fc58f-9a2b-4586-bc38-d645cae21425\" (UID: \"888fc58f-9a2b-4586-bc38-d645cae21425\") " Mar 09 14:20:54 crc kubenswrapper[4764]: I0309 14:20:54.725522 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/888fc58f-9a2b-4586-bc38-d645cae21425-utilities\") pod \"888fc58f-9a2b-4586-bc38-d645cae21425\" (UID: \"888fc58f-9a2b-4586-bc38-d645cae21425\") " Mar 09 14:20:54 crc kubenswrapper[4764]: I0309 14:20:54.725778 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/888fc58f-9a2b-4586-bc38-d645cae21425-catalog-content\") pod \"888fc58f-9a2b-4586-bc38-d645cae21425\" (UID: \"888fc58f-9a2b-4586-bc38-d645cae21425\") " Mar 09 14:20:54 crc kubenswrapper[4764]: I0309 14:20:54.726309 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/888fc58f-9a2b-4586-bc38-d645cae21425-utilities" (OuterVolumeSpecName: "utilities") pod "888fc58f-9a2b-4586-bc38-d645cae21425" (UID: 
"888fc58f-9a2b-4586-bc38-d645cae21425"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:20:54 crc kubenswrapper[4764]: I0309 14:20:54.726801 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/888fc58f-9a2b-4586-bc38-d645cae21425-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:20:54 crc kubenswrapper[4764]: I0309 14:20:54.733298 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/888fc58f-9a2b-4586-bc38-d645cae21425-kube-api-access-tq8tf" (OuterVolumeSpecName: "kube-api-access-tq8tf") pod "888fc58f-9a2b-4586-bc38-d645cae21425" (UID: "888fc58f-9a2b-4586-bc38-d645cae21425"). InnerVolumeSpecName "kube-api-access-tq8tf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:20:54 crc kubenswrapper[4764]: I0309 14:20:54.755110 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/888fc58f-9a2b-4586-bc38-d645cae21425-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "888fc58f-9a2b-4586-bc38-d645cae21425" (UID: "888fc58f-9a2b-4586-bc38-d645cae21425"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:20:54 crc kubenswrapper[4764]: I0309 14:20:54.828715 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/888fc58f-9a2b-4586-bc38-d645cae21425-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:20:54 crc kubenswrapper[4764]: I0309 14:20:54.828765 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tq8tf\" (UniqueName: \"kubernetes.io/projected/888fc58f-9a2b-4586-bc38-d645cae21425-kube-api-access-tq8tf\") on node \"crc\" DevicePath \"\"" Mar 09 14:20:55 crc kubenswrapper[4764]: I0309 14:20:55.000777 4764 generic.go:334] "Generic (PLEG): container finished" podID="888fc58f-9a2b-4586-bc38-d645cae21425" containerID="dee20c84fe6a88a99746d593b2b47a68cf15d7855d8604d9f74434aafdb9bf89" exitCode=0 Mar 09 14:20:55 crc kubenswrapper[4764]: I0309 14:20:55.000883 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rbppf" Mar 09 14:20:55 crc kubenswrapper[4764]: I0309 14:20:55.000880 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbppf" event={"ID":"888fc58f-9a2b-4586-bc38-d645cae21425","Type":"ContainerDied","Data":"dee20c84fe6a88a99746d593b2b47a68cf15d7855d8604d9f74434aafdb9bf89"} Mar 09 14:20:55 crc kubenswrapper[4764]: I0309 14:20:55.002635 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbppf" event={"ID":"888fc58f-9a2b-4586-bc38-d645cae21425","Type":"ContainerDied","Data":"54048bcbd0e67426acff49fbd46b2a61d8208845b90d713b788118d239a1246a"} Mar 09 14:20:55 crc kubenswrapper[4764]: I0309 14:20:55.002683 4764 scope.go:117] "RemoveContainer" containerID="dee20c84fe6a88a99746d593b2b47a68cf15d7855d8604d9f74434aafdb9bf89" Mar 09 14:20:55 crc kubenswrapper[4764]: I0309 14:20:55.046755 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-rbppf"] Mar 09 14:20:55 crc kubenswrapper[4764]: I0309 14:20:55.053178 4764 scope.go:117] "RemoveContainer" containerID="fb3683c2f5b994d9bf1b4476b7c14f9500475e2cfc2431fdcb9a1a8f8933c048" Mar 09 14:20:55 crc kubenswrapper[4764]: I0309 14:20:55.059897 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rbppf"] Mar 09 14:20:55 crc kubenswrapper[4764]: I0309 14:20:55.076818 4764 scope.go:117] "RemoveContainer" containerID="b626cc702ddc013282f6ebfcbde013693e80a5808ff3a9fa5b4069a2ac0da90d" Mar 09 14:20:55 crc kubenswrapper[4764]: I0309 14:20:55.123844 4764 scope.go:117] "RemoveContainer" containerID="dee20c84fe6a88a99746d593b2b47a68cf15d7855d8604d9f74434aafdb9bf89" Mar 09 14:20:55 crc kubenswrapper[4764]: E0309 14:20:55.124292 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dee20c84fe6a88a99746d593b2b47a68cf15d7855d8604d9f74434aafdb9bf89\": container with ID starting with dee20c84fe6a88a99746d593b2b47a68cf15d7855d8604d9f74434aafdb9bf89 not found: ID does not exist" containerID="dee20c84fe6a88a99746d593b2b47a68cf15d7855d8604d9f74434aafdb9bf89" Mar 09 14:20:55 crc kubenswrapper[4764]: I0309 14:20:55.124329 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dee20c84fe6a88a99746d593b2b47a68cf15d7855d8604d9f74434aafdb9bf89"} err="failed to get container status \"dee20c84fe6a88a99746d593b2b47a68cf15d7855d8604d9f74434aafdb9bf89\": rpc error: code = NotFound desc = could not find container \"dee20c84fe6a88a99746d593b2b47a68cf15d7855d8604d9f74434aafdb9bf89\": container with ID starting with dee20c84fe6a88a99746d593b2b47a68cf15d7855d8604d9f74434aafdb9bf89 not found: ID does not exist" Mar 09 14:20:55 crc kubenswrapper[4764]: I0309 14:20:55.124354 4764 scope.go:117] "RemoveContainer" 
containerID="fb3683c2f5b994d9bf1b4476b7c14f9500475e2cfc2431fdcb9a1a8f8933c048" Mar 09 14:20:55 crc kubenswrapper[4764]: E0309 14:20:55.124692 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb3683c2f5b994d9bf1b4476b7c14f9500475e2cfc2431fdcb9a1a8f8933c048\": container with ID starting with fb3683c2f5b994d9bf1b4476b7c14f9500475e2cfc2431fdcb9a1a8f8933c048 not found: ID does not exist" containerID="fb3683c2f5b994d9bf1b4476b7c14f9500475e2cfc2431fdcb9a1a8f8933c048" Mar 09 14:20:55 crc kubenswrapper[4764]: I0309 14:20:55.124715 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb3683c2f5b994d9bf1b4476b7c14f9500475e2cfc2431fdcb9a1a8f8933c048"} err="failed to get container status \"fb3683c2f5b994d9bf1b4476b7c14f9500475e2cfc2431fdcb9a1a8f8933c048\": rpc error: code = NotFound desc = could not find container \"fb3683c2f5b994d9bf1b4476b7c14f9500475e2cfc2431fdcb9a1a8f8933c048\": container with ID starting with fb3683c2f5b994d9bf1b4476b7c14f9500475e2cfc2431fdcb9a1a8f8933c048 not found: ID does not exist" Mar 09 14:20:55 crc kubenswrapper[4764]: I0309 14:20:55.124728 4764 scope.go:117] "RemoveContainer" containerID="b626cc702ddc013282f6ebfcbde013693e80a5808ff3a9fa5b4069a2ac0da90d" Mar 09 14:20:55 crc kubenswrapper[4764]: E0309 14:20:55.124973 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b626cc702ddc013282f6ebfcbde013693e80a5808ff3a9fa5b4069a2ac0da90d\": container with ID starting with b626cc702ddc013282f6ebfcbde013693e80a5808ff3a9fa5b4069a2ac0da90d not found: ID does not exist" containerID="b626cc702ddc013282f6ebfcbde013693e80a5808ff3a9fa5b4069a2ac0da90d" Mar 09 14:20:55 crc kubenswrapper[4764]: I0309 14:20:55.124995 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b626cc702ddc013282f6ebfcbde013693e80a5808ff3a9fa5b4069a2ac0da90d"} err="failed to get container status \"b626cc702ddc013282f6ebfcbde013693e80a5808ff3a9fa5b4069a2ac0da90d\": rpc error: code = NotFound desc = could not find container \"b626cc702ddc013282f6ebfcbde013693e80a5808ff3a9fa5b4069a2ac0da90d\": container with ID starting with b626cc702ddc013282f6ebfcbde013693e80a5808ff3a9fa5b4069a2ac0da90d not found: ID does not exist" Mar 09 14:20:55 crc kubenswrapper[4764]: I0309 14:20:55.573467 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="888fc58f-9a2b-4586-bc38-d645cae21425" path="/var/lib/kubelet/pods/888fc58f-9a2b-4586-bc38-d645cae21425/volumes" Mar 09 14:20:58 crc kubenswrapper[4764]: I0309 14:20:58.370321 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:20:58 crc kubenswrapper[4764]: I0309 14:20:58.371153 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:21:11 crc kubenswrapper[4764]: I0309 14:21:11.318333 4764 scope.go:117] "RemoveContainer" containerID="8d377192006e66551cb0ea6108fb1f8c6b81bac19c99e99d5e21dffabe9e931d" Mar 09 14:21:28 crc kubenswrapper[4764]: I0309 14:21:28.370756 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 
14:21:28 crc kubenswrapper[4764]: I0309 14:21:28.371572 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:21:58 crc kubenswrapper[4764]: I0309 14:21:58.370335 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:21:58 crc kubenswrapper[4764]: I0309 14:21:58.371187 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:21:58 crc kubenswrapper[4764]: I0309 14:21:58.371256 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 14:21:58 crc kubenswrapper[4764]: I0309 14:21:58.372270 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096"} pod="openshift-machine-config-operator/machine-config-daemon-xxczl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 14:21:58 crc kubenswrapper[4764]: I0309 14:21:58.372324 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" 
podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" containerID="cri-o://c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" gracePeriod=600 Mar 09 14:21:58 crc kubenswrapper[4764]: E0309 14:21:58.494027 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:21:58 crc kubenswrapper[4764]: I0309 14:21:58.680685 4764 generic.go:334] "Generic (PLEG): container finished" podID="6bcdd179-43c2-427c-9fac-7155c122e922" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" exitCode=0 Mar 09 14:21:58 crc kubenswrapper[4764]: I0309 14:21:58.680757 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerDied","Data":"c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096"} Mar 09 14:21:58 crc kubenswrapper[4764]: I0309 14:21:58.681107 4764 scope.go:117] "RemoveContainer" containerID="e0fd7c06957b904af3795728b54332eb3bc1c0b6a8df56e1a4bc3599bdedd8b5" Mar 09 14:21:58 crc kubenswrapper[4764]: I0309 14:21:58.681663 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:21:58 crc kubenswrapper[4764]: E0309 14:21:58.682010 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:22:00 crc kubenswrapper[4764]: I0309 14:22:00.159047 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551102-gnwjq"] Mar 09 14:22:00 crc kubenswrapper[4764]: E0309 14:22:00.160103 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="888fc58f-9a2b-4586-bc38-d645cae21425" containerName="extract-content" Mar 09 14:22:00 crc kubenswrapper[4764]: I0309 14:22:00.160121 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="888fc58f-9a2b-4586-bc38-d645cae21425" containerName="extract-content" Mar 09 14:22:00 crc kubenswrapper[4764]: E0309 14:22:00.160154 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="888fc58f-9a2b-4586-bc38-d645cae21425" containerName="extract-utilities" Mar 09 14:22:00 crc kubenswrapper[4764]: I0309 14:22:00.160167 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="888fc58f-9a2b-4586-bc38-d645cae21425" containerName="extract-utilities" Mar 09 14:22:00 crc kubenswrapper[4764]: E0309 14:22:00.160194 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="888fc58f-9a2b-4586-bc38-d645cae21425" containerName="registry-server" Mar 09 14:22:00 crc kubenswrapper[4764]: I0309 14:22:00.160201 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="888fc58f-9a2b-4586-bc38-d645cae21425" containerName="registry-server" Mar 09 14:22:00 crc kubenswrapper[4764]: I0309 14:22:00.160390 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="888fc58f-9a2b-4586-bc38-d645cae21425" containerName="registry-server" Mar 09 14:22:00 crc kubenswrapper[4764]: I0309 14:22:00.161457 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551102-gnwjq" Mar 09 14:22:00 crc kubenswrapper[4764]: I0309 14:22:00.166700 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:22:00 crc kubenswrapper[4764]: I0309 14:22:00.166873 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 14:22:00 crc kubenswrapper[4764]: I0309 14:22:00.167036 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:22:00 crc kubenswrapper[4764]: I0309 14:22:00.171637 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551102-gnwjq"] Mar 09 14:22:00 crc kubenswrapper[4764]: I0309 14:22:00.239010 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tfnd\" (UniqueName: \"kubernetes.io/projected/8f976f5d-f876-491a-8557-f6755b9641a3-kube-api-access-4tfnd\") pod \"auto-csr-approver-29551102-gnwjq\" (UID: \"8f976f5d-f876-491a-8557-f6755b9641a3\") " pod="openshift-infra/auto-csr-approver-29551102-gnwjq" Mar 09 14:22:00 crc kubenswrapper[4764]: I0309 14:22:00.341589 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tfnd\" (UniqueName: \"kubernetes.io/projected/8f976f5d-f876-491a-8557-f6755b9641a3-kube-api-access-4tfnd\") pod \"auto-csr-approver-29551102-gnwjq\" (UID: \"8f976f5d-f876-491a-8557-f6755b9641a3\") " pod="openshift-infra/auto-csr-approver-29551102-gnwjq" Mar 09 14:22:00 crc kubenswrapper[4764]: I0309 14:22:00.366567 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tfnd\" (UniqueName: \"kubernetes.io/projected/8f976f5d-f876-491a-8557-f6755b9641a3-kube-api-access-4tfnd\") pod \"auto-csr-approver-29551102-gnwjq\" (UID: \"8f976f5d-f876-491a-8557-f6755b9641a3\") " 
pod="openshift-infra/auto-csr-approver-29551102-gnwjq"
Mar 09 14:22:00 crc kubenswrapper[4764]: I0309 14:22:00.514479 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551102-gnwjq"
Mar 09 14:22:01 crc kubenswrapper[4764]: I0309 14:22:01.041904 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551102-gnwjq"]
Mar 09 14:22:01 crc kubenswrapper[4764]: I0309 14:22:01.725287 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551102-gnwjq" event={"ID":"8f976f5d-f876-491a-8557-f6755b9641a3","Type":"ContainerStarted","Data":"6ce2883cdfe320e4d50c01ec1293e26b7d633f3bddc2555890a1b8545691ee5e"}
Mar 09 14:22:02 crc kubenswrapper[4764]: I0309 14:22:02.741669 4764 generic.go:334] "Generic (PLEG): container finished" podID="8f976f5d-f876-491a-8557-f6755b9641a3" containerID="469e856eb9ee2a8bcc4f55156a6acb6b99b858dcd6fc4bec4aaa66e7dc6669a8" exitCode=0
Mar 09 14:22:02 crc kubenswrapper[4764]: I0309 14:22:02.741768 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551102-gnwjq" event={"ID":"8f976f5d-f876-491a-8557-f6755b9641a3","Type":"ContainerDied","Data":"469e856eb9ee2a8bcc4f55156a6acb6b99b858dcd6fc4bec4aaa66e7dc6669a8"}
Mar 09 14:22:04 crc kubenswrapper[4764]: I0309 14:22:04.216307 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551102-gnwjq"
Mar 09 14:22:04 crc kubenswrapper[4764]: I0309 14:22:04.335377 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tfnd\" (UniqueName: \"kubernetes.io/projected/8f976f5d-f876-491a-8557-f6755b9641a3-kube-api-access-4tfnd\") pod \"8f976f5d-f876-491a-8557-f6755b9641a3\" (UID: \"8f976f5d-f876-491a-8557-f6755b9641a3\") "
Mar 09 14:22:04 crc kubenswrapper[4764]: I0309 14:22:04.342681 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f976f5d-f876-491a-8557-f6755b9641a3-kube-api-access-4tfnd" (OuterVolumeSpecName: "kube-api-access-4tfnd") pod "8f976f5d-f876-491a-8557-f6755b9641a3" (UID: "8f976f5d-f876-491a-8557-f6755b9641a3"). InnerVolumeSpecName "kube-api-access-4tfnd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:22:04 crc kubenswrapper[4764]: I0309 14:22:04.437997 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tfnd\" (UniqueName: \"kubernetes.io/projected/8f976f5d-f876-491a-8557-f6755b9641a3-kube-api-access-4tfnd\") on node \"crc\" DevicePath \"\""
Mar 09 14:22:04 crc kubenswrapper[4764]: I0309 14:22:04.765152 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551102-gnwjq" event={"ID":"8f976f5d-f876-491a-8557-f6755b9641a3","Type":"ContainerDied","Data":"6ce2883cdfe320e4d50c01ec1293e26b7d633f3bddc2555890a1b8545691ee5e"}
Mar 09 14:22:04 crc kubenswrapper[4764]: I0309 14:22:04.765208 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ce2883cdfe320e4d50c01ec1293e26b7d633f3bddc2555890a1b8545691ee5e"
Mar 09 14:22:04 crc kubenswrapper[4764]: I0309 14:22:04.765220 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551102-gnwjq"
Mar 09 14:22:05 crc kubenswrapper[4764]: I0309 14:22:05.295515 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551096-9bl56"]
Mar 09 14:22:05 crc kubenswrapper[4764]: I0309 14:22:05.308111 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551096-9bl56"]
Mar 09 14:22:05 crc kubenswrapper[4764]: I0309 14:22:05.577400 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3" path="/var/lib/kubelet/pods/40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3/volumes"
Mar 09 14:22:11 crc kubenswrapper[4764]: I0309 14:22:11.427084 4764 scope.go:117] "RemoveContainer" containerID="ec447d2d121a3c0de24bcb688dbed1ad219802d16c0a98f40138fd4ac8dc4b32"
Mar 09 14:22:11 crc kubenswrapper[4764]: I0309 14:22:11.559736 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096"
Mar 09 14:22:11 crc kubenswrapper[4764]: E0309 14:22:11.560135 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:22:24 crc kubenswrapper[4764]: I0309 14:22:24.560284 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096"
Mar 09 14:22:24 crc kubenswrapper[4764]: E0309 14:22:24.561453 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:22:26 crc kubenswrapper[4764]: I0309 14:22:26.186116 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fcpdb"]
Mar 09 14:22:26 crc kubenswrapper[4764]: E0309 14:22:26.187229 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f976f5d-f876-491a-8557-f6755b9641a3" containerName="oc"
Mar 09 14:22:26 crc kubenswrapper[4764]: I0309 14:22:26.187252 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f976f5d-f876-491a-8557-f6755b9641a3" containerName="oc"
Mar 09 14:22:26 crc kubenswrapper[4764]: I0309 14:22:26.187521 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f976f5d-f876-491a-8557-f6755b9641a3" containerName="oc"
Mar 09 14:22:26 crc kubenswrapper[4764]: I0309 14:22:26.189376 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fcpdb"
Mar 09 14:22:26 crc kubenswrapper[4764]: I0309 14:22:26.198795 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fcpdb"]
Mar 09 14:22:26 crc kubenswrapper[4764]: I0309 14:22:26.281074 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4638c73f-5adb-4e39-b7d3-b1d6627b7705-utilities\") pod \"redhat-operators-fcpdb\" (UID: \"4638c73f-5adb-4e39-b7d3-b1d6627b7705\") " pod="openshift-marketplace/redhat-operators-fcpdb"
Mar 09 14:22:26 crc kubenswrapper[4764]: I0309 14:22:26.281188 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2bwd\" (UniqueName: \"kubernetes.io/projected/4638c73f-5adb-4e39-b7d3-b1d6627b7705-kube-api-access-j2bwd\") pod \"redhat-operators-fcpdb\" (UID: \"4638c73f-5adb-4e39-b7d3-b1d6627b7705\") " pod="openshift-marketplace/redhat-operators-fcpdb"
Mar 09 14:22:26 crc kubenswrapper[4764]: I0309 14:22:26.281863 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4638c73f-5adb-4e39-b7d3-b1d6627b7705-catalog-content\") pod \"redhat-operators-fcpdb\" (UID: \"4638c73f-5adb-4e39-b7d3-b1d6627b7705\") " pod="openshift-marketplace/redhat-operators-fcpdb"
Mar 09 14:22:26 crc kubenswrapper[4764]: I0309 14:22:26.384826 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4638c73f-5adb-4e39-b7d3-b1d6627b7705-utilities\") pod \"redhat-operators-fcpdb\" (UID: \"4638c73f-5adb-4e39-b7d3-b1d6627b7705\") " pod="openshift-marketplace/redhat-operators-fcpdb"
Mar 09 14:22:26 crc kubenswrapper[4764]: I0309 14:22:26.384952 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2bwd\" (UniqueName: \"kubernetes.io/projected/4638c73f-5adb-4e39-b7d3-b1d6627b7705-kube-api-access-j2bwd\") pod \"redhat-operators-fcpdb\" (UID: \"4638c73f-5adb-4e39-b7d3-b1d6627b7705\") " pod="openshift-marketplace/redhat-operators-fcpdb"
Mar 09 14:22:26 crc kubenswrapper[4764]: I0309 14:22:26.385120 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4638c73f-5adb-4e39-b7d3-b1d6627b7705-catalog-content\") pod \"redhat-operators-fcpdb\" (UID: \"4638c73f-5adb-4e39-b7d3-b1d6627b7705\") " pod="openshift-marketplace/redhat-operators-fcpdb"
Mar 09 14:22:26 crc kubenswrapper[4764]: I0309 14:22:26.385607 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4638c73f-5adb-4e39-b7d3-b1d6627b7705-utilities\") pod \"redhat-operators-fcpdb\" (UID: \"4638c73f-5adb-4e39-b7d3-b1d6627b7705\") " pod="openshift-marketplace/redhat-operators-fcpdb"
Mar 09 14:22:26 crc kubenswrapper[4764]: I0309 14:22:26.385822 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4638c73f-5adb-4e39-b7d3-b1d6627b7705-catalog-content\") pod \"redhat-operators-fcpdb\" (UID: \"4638c73f-5adb-4e39-b7d3-b1d6627b7705\") " pod="openshift-marketplace/redhat-operators-fcpdb"
Mar 09 14:22:26 crc kubenswrapper[4764]: I0309 14:22:26.416694 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2bwd\" (UniqueName: \"kubernetes.io/projected/4638c73f-5adb-4e39-b7d3-b1d6627b7705-kube-api-access-j2bwd\") pod \"redhat-operators-fcpdb\" (UID: \"4638c73f-5adb-4e39-b7d3-b1d6627b7705\") " pod="openshift-marketplace/redhat-operators-fcpdb"
Mar 09 14:22:26 crc kubenswrapper[4764]: I0309 14:22:26.524251 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fcpdb"
Mar 09 14:22:27 crc kubenswrapper[4764]: I0309 14:22:27.082075 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fcpdb"]
Mar 09 14:22:28 crc kubenswrapper[4764]: I0309 14:22:28.051316 4764 generic.go:334] "Generic (PLEG): container finished" podID="4638c73f-5adb-4e39-b7d3-b1d6627b7705" containerID="56d76f8c6bb58626d2529263241264f7ef0049500bf7d5bd6760d3c4cee8e3c0" exitCode=0
Mar 09 14:22:28 crc kubenswrapper[4764]: I0309 14:22:28.051730 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcpdb" event={"ID":"4638c73f-5adb-4e39-b7d3-b1d6627b7705","Type":"ContainerDied","Data":"56d76f8c6bb58626d2529263241264f7ef0049500bf7d5bd6760d3c4cee8e3c0"}
Mar 09 14:22:28 crc kubenswrapper[4764]: I0309 14:22:28.051772 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcpdb" event={"ID":"4638c73f-5adb-4e39-b7d3-b1d6627b7705","Type":"ContainerStarted","Data":"d22ecbeedc703150c2c29bd0be02199d91646441b7ef06e81064110c6b81f83e"}
Mar 09 14:22:29 crc kubenswrapper[4764]: I0309 14:22:29.066103 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcpdb" event={"ID":"4638c73f-5adb-4e39-b7d3-b1d6627b7705","Type":"ContainerStarted","Data":"4eae2ce540cdd8401ae00543317b875985d0ae3eeaf5bcbc0a26c17c98f2b329"}
Mar 09 14:22:30 crc kubenswrapper[4764]: I0309 14:22:30.078411 4764 generic.go:334] "Generic (PLEG): container finished" podID="4638c73f-5adb-4e39-b7d3-b1d6627b7705" containerID="4eae2ce540cdd8401ae00543317b875985d0ae3eeaf5bcbc0a26c17c98f2b329" exitCode=0
Mar 09 14:22:30 crc kubenswrapper[4764]: I0309 14:22:30.078500 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcpdb" event={"ID":"4638c73f-5adb-4e39-b7d3-b1d6627b7705","Type":"ContainerDied","Data":"4eae2ce540cdd8401ae00543317b875985d0ae3eeaf5bcbc0a26c17c98f2b329"}
Mar 09 14:22:31 crc kubenswrapper[4764]: I0309 14:22:31.092958 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcpdb" event={"ID":"4638c73f-5adb-4e39-b7d3-b1d6627b7705","Type":"ContainerStarted","Data":"ad80cf2b70762a65d9c7f26342ac7f2b7bc1aeeb502716f1f6efa154db3dcdb4"}
Mar 09 14:22:31 crc kubenswrapper[4764]: I0309 14:22:31.124887 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fcpdb" podStartSLOduration=2.665947062 podStartE2EDuration="5.124865415s" podCreationTimestamp="2026-03-09 14:22:26 +0000 UTC" firstStartedPulling="2026-03-09 14:22:28.063672063 +0000 UTC m=+3703.313843971" lastFinishedPulling="2026-03-09 14:22:30.522590416 +0000 UTC m=+3705.772762324" observedRunningTime="2026-03-09 14:22:31.119710987 +0000 UTC m=+3706.369882905" watchObservedRunningTime="2026-03-09 14:22:31.124865415 +0000 UTC m=+3706.375037323"
Mar 09 14:22:36 crc kubenswrapper[4764]: I0309 14:22:36.525171 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fcpdb"
Mar 09 14:22:36 crc kubenswrapper[4764]: I0309 14:22:36.526514 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fcpdb"
Mar 09 14:22:37 crc kubenswrapper[4764]: I0309 14:22:37.572738 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fcpdb" podUID="4638c73f-5adb-4e39-b7d3-b1d6627b7705" containerName="registry-server" probeResult="failure" output=<
Mar 09 14:22:37 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s
Mar 09 14:22:37 crc kubenswrapper[4764]: >
Mar 09 14:22:39 crc kubenswrapper[4764]: I0309 14:22:39.560787 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096"
Mar 09 14:22:39 crc kubenswrapper[4764]: E0309 14:22:39.561792 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:22:46 crc kubenswrapper[4764]: I0309 14:22:46.583123 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fcpdb"
Mar 09 14:22:46 crc kubenswrapper[4764]: I0309 14:22:46.636577 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fcpdb"
Mar 09 14:22:46 crc kubenswrapper[4764]: I0309 14:22:46.829138 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fcpdb"]
Mar 09 14:22:48 crc kubenswrapper[4764]: I0309 14:22:48.310298 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fcpdb" podUID="4638c73f-5adb-4e39-b7d3-b1d6627b7705" containerName="registry-server" containerID="cri-o://ad80cf2b70762a65d9c7f26342ac7f2b7bc1aeeb502716f1f6efa154db3dcdb4" gracePeriod=2
Mar 09 14:22:48 crc kubenswrapper[4764]: I0309 14:22:48.925942 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fcpdb"
Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.092305 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2bwd\" (UniqueName: \"kubernetes.io/projected/4638c73f-5adb-4e39-b7d3-b1d6627b7705-kube-api-access-j2bwd\") pod \"4638c73f-5adb-4e39-b7d3-b1d6627b7705\" (UID: \"4638c73f-5adb-4e39-b7d3-b1d6627b7705\") "
Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.092667 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4638c73f-5adb-4e39-b7d3-b1d6627b7705-catalog-content\") pod \"4638c73f-5adb-4e39-b7d3-b1d6627b7705\" (UID: \"4638c73f-5adb-4e39-b7d3-b1d6627b7705\") "
Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.092767 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4638c73f-5adb-4e39-b7d3-b1d6627b7705-utilities\") pod \"4638c73f-5adb-4e39-b7d3-b1d6627b7705\" (UID: \"4638c73f-5adb-4e39-b7d3-b1d6627b7705\") "
Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.095213 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4638c73f-5adb-4e39-b7d3-b1d6627b7705-utilities" (OuterVolumeSpecName: "utilities") pod "4638c73f-5adb-4e39-b7d3-b1d6627b7705" (UID: "4638c73f-5adb-4e39-b7d3-b1d6627b7705"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.101023 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4638c73f-5adb-4e39-b7d3-b1d6627b7705-kube-api-access-j2bwd" (OuterVolumeSpecName: "kube-api-access-j2bwd") pod "4638c73f-5adb-4e39-b7d3-b1d6627b7705" (UID: "4638c73f-5adb-4e39-b7d3-b1d6627b7705"). InnerVolumeSpecName "kube-api-access-j2bwd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.195702 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2bwd\" (UniqueName: \"kubernetes.io/projected/4638c73f-5adb-4e39-b7d3-b1d6627b7705-kube-api-access-j2bwd\") on node \"crc\" DevicePath \"\""
Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.195742 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4638c73f-5adb-4e39-b7d3-b1d6627b7705-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.242357 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4638c73f-5adb-4e39-b7d3-b1d6627b7705-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4638c73f-5adb-4e39-b7d3-b1d6627b7705" (UID: "4638c73f-5adb-4e39-b7d3-b1d6627b7705"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.298840 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4638c73f-5adb-4e39-b7d3-b1d6627b7705-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.324569 4764 generic.go:334] "Generic (PLEG): container finished" podID="4638c73f-5adb-4e39-b7d3-b1d6627b7705" containerID="ad80cf2b70762a65d9c7f26342ac7f2b7bc1aeeb502716f1f6efa154db3dcdb4" exitCode=0
Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.324629 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcpdb" event={"ID":"4638c73f-5adb-4e39-b7d3-b1d6627b7705","Type":"ContainerDied","Data":"ad80cf2b70762a65d9c7f26342ac7f2b7bc1aeeb502716f1f6efa154db3dcdb4"}
Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.324682 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcpdb" event={"ID":"4638c73f-5adb-4e39-b7d3-b1d6627b7705","Type":"ContainerDied","Data":"d22ecbeedc703150c2c29bd0be02199d91646441b7ef06e81064110c6b81f83e"}
Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.324709 4764 scope.go:117] "RemoveContainer" containerID="ad80cf2b70762a65d9c7f26342ac7f2b7bc1aeeb502716f1f6efa154db3dcdb4"
Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.324955 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fcpdb"
Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.367904 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fcpdb"]
Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.367973 4764 scope.go:117] "RemoveContainer" containerID="4eae2ce540cdd8401ae00543317b875985d0ae3eeaf5bcbc0a26c17c98f2b329"
Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.379099 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fcpdb"]
Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.410385 4764 scope.go:117] "RemoveContainer" containerID="56d76f8c6bb58626d2529263241264f7ef0049500bf7d5bd6760d3c4cee8e3c0"
Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.462086 4764 scope.go:117] "RemoveContainer" containerID="ad80cf2b70762a65d9c7f26342ac7f2b7bc1aeeb502716f1f6efa154db3dcdb4"
Mar 09 14:22:49 crc kubenswrapper[4764]: E0309 14:22:49.462791 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad80cf2b70762a65d9c7f26342ac7f2b7bc1aeeb502716f1f6efa154db3dcdb4\": container with ID starting with ad80cf2b70762a65d9c7f26342ac7f2b7bc1aeeb502716f1f6efa154db3dcdb4 not found: ID does not exist" containerID="ad80cf2b70762a65d9c7f26342ac7f2b7bc1aeeb502716f1f6efa154db3dcdb4"
Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.462856 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad80cf2b70762a65d9c7f26342ac7f2b7bc1aeeb502716f1f6efa154db3dcdb4"} err="failed to get container status \"ad80cf2b70762a65d9c7f26342ac7f2b7bc1aeeb502716f1f6efa154db3dcdb4\": rpc error: code = NotFound desc = could not find container \"ad80cf2b70762a65d9c7f26342ac7f2b7bc1aeeb502716f1f6efa154db3dcdb4\": container with ID starting with ad80cf2b70762a65d9c7f26342ac7f2b7bc1aeeb502716f1f6efa154db3dcdb4 not found: ID does not exist"
Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.462893 4764 scope.go:117] "RemoveContainer" containerID="4eae2ce540cdd8401ae00543317b875985d0ae3eeaf5bcbc0a26c17c98f2b329"
Mar 09 14:22:49 crc kubenswrapper[4764]: E0309 14:22:49.463330 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4eae2ce540cdd8401ae00543317b875985d0ae3eeaf5bcbc0a26c17c98f2b329\": container with ID starting with 4eae2ce540cdd8401ae00543317b875985d0ae3eeaf5bcbc0a26c17c98f2b329 not found: ID does not exist" containerID="4eae2ce540cdd8401ae00543317b875985d0ae3eeaf5bcbc0a26c17c98f2b329"
Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.463362 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eae2ce540cdd8401ae00543317b875985d0ae3eeaf5bcbc0a26c17c98f2b329"} err="failed to get container status \"4eae2ce540cdd8401ae00543317b875985d0ae3eeaf5bcbc0a26c17c98f2b329\": rpc error: code = NotFound desc = could not find container \"4eae2ce540cdd8401ae00543317b875985d0ae3eeaf5bcbc0a26c17c98f2b329\": container with ID starting with 4eae2ce540cdd8401ae00543317b875985d0ae3eeaf5bcbc0a26c17c98f2b329 not found: ID does not exist"
Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.463378 4764 scope.go:117] "RemoveContainer" containerID="56d76f8c6bb58626d2529263241264f7ef0049500bf7d5bd6760d3c4cee8e3c0"
Mar 09 14:22:49 crc kubenswrapper[4764]: E0309 14:22:49.463640 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56d76f8c6bb58626d2529263241264f7ef0049500bf7d5bd6760d3c4cee8e3c0\": container with ID starting with 56d76f8c6bb58626d2529263241264f7ef0049500bf7d5bd6760d3c4cee8e3c0 not found: ID does not exist" containerID="56d76f8c6bb58626d2529263241264f7ef0049500bf7d5bd6760d3c4cee8e3c0"
Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.463688 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56d76f8c6bb58626d2529263241264f7ef0049500bf7d5bd6760d3c4cee8e3c0"} err="failed to get container status \"56d76f8c6bb58626d2529263241264f7ef0049500bf7d5bd6760d3c4cee8e3c0\": rpc error: code = NotFound desc = could not find container \"56d76f8c6bb58626d2529263241264f7ef0049500bf7d5bd6760d3c4cee8e3c0\": container with ID starting with 56d76f8c6bb58626d2529263241264f7ef0049500bf7d5bd6760d3c4cee8e3c0 not found: ID does not exist"
Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.572386 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4638c73f-5adb-4e39-b7d3-b1d6627b7705" path="/var/lib/kubelet/pods/4638c73f-5adb-4e39-b7d3-b1d6627b7705/volumes"
Mar 09 14:22:50 crc kubenswrapper[4764]: I0309 14:22:50.561436 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096"
Mar 09 14:22:50 crc kubenswrapper[4764]: E0309 14:22:50.561824 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:23:02 crc kubenswrapper[4764]: I0309 14:23:02.560622 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096"
Mar 09 14:23:02 crc kubenswrapper[4764]: E0309 14:23:02.562021 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:23:16 crc kubenswrapper[4764]: I0309 14:23:16.560536 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096"
Mar 09 14:23:16 crc kubenswrapper[4764]: E0309 14:23:16.561571 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:23:31 crc kubenswrapper[4764]: I0309 14:23:31.560197 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096"
Mar 09 14:23:31 crc kubenswrapper[4764]: E0309 14:23:31.561336 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:23:46 crc kubenswrapper[4764]: I0309 14:23:46.560942 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096"
Mar 09 14:23:46 crc kubenswrapper[4764]: E0309 14:23:46.562209 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:23:58 crc kubenswrapper[4764]: I0309 14:23:58.560040 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096"
Mar 09 14:23:58 crc kubenswrapper[4764]: E0309 14:23:58.561258 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:24:00 crc kubenswrapper[4764]: I0309 14:24:00.174265 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551104-pc4h6"]
Mar 09 14:24:00 crc kubenswrapper[4764]: E0309 14:24:00.175267 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4638c73f-5adb-4e39-b7d3-b1d6627b7705" containerName="registry-server"
Mar 09 14:24:00 crc kubenswrapper[4764]: I0309 14:24:00.175288 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4638c73f-5adb-4e39-b7d3-b1d6627b7705" containerName="registry-server"
Mar 09 14:24:00 crc kubenswrapper[4764]: E0309 14:24:00.175354 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4638c73f-5adb-4e39-b7d3-b1d6627b7705" containerName="extract-utilities"
Mar 09 14:24:00 crc kubenswrapper[4764]: I0309 14:24:00.175363 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4638c73f-5adb-4e39-b7d3-b1d6627b7705" containerName="extract-utilities"
Mar 09 14:24:00 crc kubenswrapper[4764]: E0309 14:24:00.175377 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4638c73f-5adb-4e39-b7d3-b1d6627b7705" containerName="extract-content"
Mar 09 14:24:00 crc kubenswrapper[4764]: I0309 14:24:00.175386 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4638c73f-5adb-4e39-b7d3-b1d6627b7705" containerName="extract-content"
Mar 09 14:24:00 crc kubenswrapper[4764]: I0309 14:24:00.175619 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4638c73f-5adb-4e39-b7d3-b1d6627b7705" containerName="registry-server"
Mar 09 14:24:00 crc kubenswrapper[4764]: I0309 14:24:00.176584 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551104-pc4h6"
Mar 09 14:24:00 crc kubenswrapper[4764]: I0309 14:24:00.181324 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 14:24:00 crc kubenswrapper[4764]: I0309 14:24:00.181324 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 14:24:00 crc kubenswrapper[4764]: I0309 14:24:00.184532 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr"
Mar 09 14:24:00 crc kubenswrapper[4764]: I0309 14:24:00.202762 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551104-pc4h6"]
Mar 09 14:24:00 crc kubenswrapper[4764]: I0309 14:24:00.214312 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5hdp\" (UniqueName: \"kubernetes.io/projected/b4e97252-1933-4b92-ab28-f9713db14afb-kube-api-access-x5hdp\") pod \"auto-csr-approver-29551104-pc4h6\" (UID: \"b4e97252-1933-4b92-ab28-f9713db14afb\") " pod="openshift-infra/auto-csr-approver-29551104-pc4h6"
Mar 09 14:24:00 crc kubenswrapper[4764]: I0309 14:24:00.317165 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5hdp\" (UniqueName: \"kubernetes.io/projected/b4e97252-1933-4b92-ab28-f9713db14afb-kube-api-access-x5hdp\") pod \"auto-csr-approver-29551104-pc4h6\" (UID: \"b4e97252-1933-4b92-ab28-f9713db14afb\") " pod="openshift-infra/auto-csr-approver-29551104-pc4h6"
Mar 09 14:24:00 crc kubenswrapper[4764]: I0309 14:24:00.343179 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5hdp\" (UniqueName: \"kubernetes.io/projected/b4e97252-1933-4b92-ab28-f9713db14afb-kube-api-access-x5hdp\") pod \"auto-csr-approver-29551104-pc4h6\" (UID: \"b4e97252-1933-4b92-ab28-f9713db14afb\") " pod="openshift-infra/auto-csr-approver-29551104-pc4h6"
Mar 09 14:24:00 crc kubenswrapper[4764]: I0309 14:24:00.499867 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551104-pc4h6"
Mar 09 14:24:01 crc kubenswrapper[4764]: I0309 14:24:00.999944 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551104-pc4h6"]
Mar 09 14:24:01 crc kubenswrapper[4764]: I0309 14:24:01.017130 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 14:24:01 crc kubenswrapper[4764]: I0309 14:24:01.161438 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551104-pc4h6" event={"ID":"b4e97252-1933-4b92-ab28-f9713db14afb","Type":"ContainerStarted","Data":"9bdfcd300abbbc3598563c8f1975cdc69a41fc3ac9b5564bdbcb92bcbc4b9f52"}
Mar 09 14:24:03 crc kubenswrapper[4764]: I0309 14:24:03.185758 4764 generic.go:334] "Generic (PLEG): container finished" podID="b4e97252-1933-4b92-ab28-f9713db14afb" containerID="fbe599b37c79d3976b2347daa88e4c2b8bc846bfc99bce86dadf53c1a9ea4b91" exitCode=0
Mar 09 14:24:03 crc kubenswrapper[4764]: I0309 14:24:03.185827 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551104-pc4h6" event={"ID":"b4e97252-1933-4b92-ab28-f9713db14afb","Type":"ContainerDied","Data":"fbe599b37c79d3976b2347daa88e4c2b8bc846bfc99bce86dadf53c1a9ea4b91"}
Mar 09 14:24:04 crc kubenswrapper[4764]: I0309 14:24:04.579347 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551104-pc4h6"
Mar 09 14:24:04 crc kubenswrapper[4764]: I0309 14:24:04.628949 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5hdp\" (UniqueName: \"kubernetes.io/projected/b4e97252-1933-4b92-ab28-f9713db14afb-kube-api-access-x5hdp\") pod \"b4e97252-1933-4b92-ab28-f9713db14afb\" (UID: \"b4e97252-1933-4b92-ab28-f9713db14afb\") "
Mar 09 14:24:04 crc kubenswrapper[4764]: I0309 14:24:04.644193 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4e97252-1933-4b92-ab28-f9713db14afb-kube-api-access-x5hdp" (OuterVolumeSpecName: "kube-api-access-x5hdp") pod "b4e97252-1933-4b92-ab28-f9713db14afb" (UID: "b4e97252-1933-4b92-ab28-f9713db14afb"). InnerVolumeSpecName "kube-api-access-x5hdp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:24:04 crc kubenswrapper[4764]: I0309 14:24:04.732443 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5hdp\" (UniqueName: \"kubernetes.io/projected/b4e97252-1933-4b92-ab28-f9713db14afb-kube-api-access-x5hdp\") on node \"crc\" DevicePath \"\""
Mar 09 14:24:05 crc kubenswrapper[4764]: I0309 14:24:05.206717 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551104-pc4h6" event={"ID":"b4e97252-1933-4b92-ab28-f9713db14afb","Type":"ContainerDied","Data":"9bdfcd300abbbc3598563c8f1975cdc69a41fc3ac9b5564bdbcb92bcbc4b9f52"}
Mar 09 14:24:05 crc kubenswrapper[4764]: I0309 14:24:05.207090 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bdfcd300abbbc3598563c8f1975cdc69a41fc3ac9b5564bdbcb92bcbc4b9f52"
Mar 09 14:24:05 crc kubenswrapper[4764]: I0309 14:24:05.206800 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551104-pc4h6"
Mar 09 14:24:05 crc kubenswrapper[4764]: I0309 14:24:05.669416 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551098-7clls"]
Mar 09 14:24:05 crc kubenswrapper[4764]: I0309 14:24:05.680869 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551098-7clls"]
Mar 09 14:24:07 crc kubenswrapper[4764]: I0309 14:24:07.572450 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e86714ea-a59a-4955-b4a5-038ce0ce7bf6" path="/var/lib/kubelet/pods/e86714ea-a59a-4955-b4a5-038ce0ce7bf6/volumes"
Mar 09 14:24:11 crc kubenswrapper[4764]: I0309 14:24:11.549859 4764 scope.go:117] "RemoveContainer" containerID="c9de23699d45475689268b26fadafce561fa4c6da1922b49b79542286bd5c95d"
Mar 09 14:24:13 crc kubenswrapper[4764]: I0309 14:24:13.564796 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096"
Mar 09 14:24:13 crc kubenswrapper[4764]: E0309 14:24:13.565726 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:24:26 crc kubenswrapper[4764]: I0309 14:24:26.560033 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096"
Mar 09 14:24:26 crc kubenswrapper[4764]: E0309 14:24:26.561143 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:24:41 crc kubenswrapper[4764]: I0309 14:24:41.559921 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096"
Mar 09 14:24:41 crc kubenswrapper[4764]: E0309 14:24:41.561025 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:24:55 crc kubenswrapper[4764]: I0309 14:24:55.568080 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096"
Mar 09 14:24:55 crc kubenswrapper[4764]: E0309 14:24:55.569360 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:25:07 crc kubenswrapper[4764]: I0309 14:25:07.559767 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096"
Mar 09 14:25:07 crc kubenswrapper[4764]: E0309 14:25:07.564136 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:25:19 crc kubenswrapper[4764]: I0309 14:25:19.560716 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:25:19 crc kubenswrapper[4764]: E0309 14:25:19.561877 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:25:31 crc kubenswrapper[4764]: I0309 14:25:31.560555 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:25:31 crc kubenswrapper[4764]: E0309 14:25:31.561685 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:25:43 crc kubenswrapper[4764]: I0309 14:25:43.560410 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:25:43 crc kubenswrapper[4764]: E0309 14:25:43.561473 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:25:58 crc kubenswrapper[4764]: I0309 14:25:58.560803 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:25:58 crc kubenswrapper[4764]: E0309 14:25:58.561996 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:26:00 crc kubenswrapper[4764]: I0309 14:26:00.152556 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551106-dnhln"] Mar 09 14:26:00 crc kubenswrapper[4764]: E0309 14:26:00.153536 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4e97252-1933-4b92-ab28-f9713db14afb" containerName="oc" Mar 09 14:26:00 crc kubenswrapper[4764]: I0309 14:26:00.153556 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4e97252-1933-4b92-ab28-f9713db14afb" containerName="oc" Mar 09 14:26:00 crc kubenswrapper[4764]: I0309 14:26:00.153828 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4e97252-1933-4b92-ab28-f9713db14afb" containerName="oc" Mar 09 14:26:00 crc kubenswrapper[4764]: I0309 14:26:00.158332 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551106-dnhln" Mar 09 14:26:00 crc kubenswrapper[4764]: I0309 14:26:00.171352 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:26:00 crc kubenswrapper[4764]: I0309 14:26:00.171519 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 14:26:00 crc kubenswrapper[4764]: I0309 14:26:00.176484 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:26:00 crc kubenswrapper[4764]: I0309 14:26:00.179042 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551106-dnhln"] Mar 09 14:26:00 crc kubenswrapper[4764]: I0309 14:26:00.277939 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcxjc\" (UniqueName: \"kubernetes.io/projected/d33fc9b0-e440-4f1b-9522-1abec06eca2a-kube-api-access-vcxjc\") pod \"auto-csr-approver-29551106-dnhln\" (UID: \"d33fc9b0-e440-4f1b-9522-1abec06eca2a\") " pod="openshift-infra/auto-csr-approver-29551106-dnhln" Mar 09 14:26:00 crc kubenswrapper[4764]: I0309 14:26:00.381196 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcxjc\" (UniqueName: \"kubernetes.io/projected/d33fc9b0-e440-4f1b-9522-1abec06eca2a-kube-api-access-vcxjc\") pod \"auto-csr-approver-29551106-dnhln\" (UID: \"d33fc9b0-e440-4f1b-9522-1abec06eca2a\") " pod="openshift-infra/auto-csr-approver-29551106-dnhln" Mar 09 14:26:00 crc kubenswrapper[4764]: I0309 14:26:00.794192 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcxjc\" (UniqueName: \"kubernetes.io/projected/d33fc9b0-e440-4f1b-9522-1abec06eca2a-kube-api-access-vcxjc\") pod \"auto-csr-approver-29551106-dnhln\" (UID: \"d33fc9b0-e440-4f1b-9522-1abec06eca2a\") " 
pod="openshift-infra/auto-csr-approver-29551106-dnhln" Mar 09 14:26:00 crc kubenswrapper[4764]: I0309 14:26:00.830732 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551106-dnhln" Mar 09 14:26:01 crc kubenswrapper[4764]: I0309 14:26:01.343092 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551106-dnhln"] Mar 09 14:26:01 crc kubenswrapper[4764]: I0309 14:26:01.386244 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551106-dnhln" event={"ID":"d33fc9b0-e440-4f1b-9522-1abec06eca2a","Type":"ContainerStarted","Data":"e9244a9303f048aa1f4d232fd72882791bc82d03a9aa22f780d5079aac920863"} Mar 09 14:26:03 crc kubenswrapper[4764]: I0309 14:26:03.408160 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551106-dnhln" event={"ID":"d33fc9b0-e440-4f1b-9522-1abec06eca2a","Type":"ContainerStarted","Data":"8f9d999651aa510edfbd48cd8433fc728bf24d4c32451ce578a41c04f581a328"} Mar 09 14:26:03 crc kubenswrapper[4764]: I0309 14:26:03.433281 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551106-dnhln" podStartSLOduration=2.299167453 podStartE2EDuration="3.433242913s" podCreationTimestamp="2026-03-09 14:26:00 +0000 UTC" firstStartedPulling="2026-03-09 14:26:01.349812092 +0000 UTC m=+3916.599984000" lastFinishedPulling="2026-03-09 14:26:02.483887542 +0000 UTC m=+3917.734059460" observedRunningTime="2026-03-09 14:26:03.423746928 +0000 UTC m=+3918.673918846" watchObservedRunningTime="2026-03-09 14:26:03.433242913 +0000 UTC m=+3918.683414831" Mar 09 14:26:04 crc kubenswrapper[4764]: I0309 14:26:04.421061 4764 generic.go:334] "Generic (PLEG): container finished" podID="d33fc9b0-e440-4f1b-9522-1abec06eca2a" containerID="8f9d999651aa510edfbd48cd8433fc728bf24d4c32451ce578a41c04f581a328" exitCode=0 Mar 09 14:26:04 crc 
kubenswrapper[4764]: I0309 14:26:04.421108 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551106-dnhln" event={"ID":"d33fc9b0-e440-4f1b-9522-1abec06eca2a","Type":"ContainerDied","Data":"8f9d999651aa510edfbd48cd8433fc728bf24d4c32451ce578a41c04f581a328"} Mar 09 14:26:05 crc kubenswrapper[4764]: I0309 14:26:05.847613 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551106-dnhln" Mar 09 14:26:05 crc kubenswrapper[4764]: I0309 14:26:05.929637 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcxjc\" (UniqueName: \"kubernetes.io/projected/d33fc9b0-e440-4f1b-9522-1abec06eca2a-kube-api-access-vcxjc\") pod \"d33fc9b0-e440-4f1b-9522-1abec06eca2a\" (UID: \"d33fc9b0-e440-4f1b-9522-1abec06eca2a\") " Mar 09 14:26:05 crc kubenswrapper[4764]: I0309 14:26:05.937328 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d33fc9b0-e440-4f1b-9522-1abec06eca2a-kube-api-access-vcxjc" (OuterVolumeSpecName: "kube-api-access-vcxjc") pod "d33fc9b0-e440-4f1b-9522-1abec06eca2a" (UID: "d33fc9b0-e440-4f1b-9522-1abec06eca2a"). InnerVolumeSpecName "kube-api-access-vcxjc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:26:06 crc kubenswrapper[4764]: I0309 14:26:06.033843 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcxjc\" (UniqueName: \"kubernetes.io/projected/d33fc9b0-e440-4f1b-9522-1abec06eca2a-kube-api-access-vcxjc\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:06 crc kubenswrapper[4764]: I0309 14:26:06.444454 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551106-dnhln" event={"ID":"d33fc9b0-e440-4f1b-9522-1abec06eca2a","Type":"ContainerDied","Data":"e9244a9303f048aa1f4d232fd72882791bc82d03a9aa22f780d5079aac920863"} Mar 09 14:26:06 crc kubenswrapper[4764]: I0309 14:26:06.444951 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9244a9303f048aa1f4d232fd72882791bc82d03a9aa22f780d5079aac920863" Mar 09 14:26:06 crc kubenswrapper[4764]: I0309 14:26:06.444613 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551106-dnhln" Mar 09 14:26:06 crc kubenswrapper[4764]: I0309 14:26:06.516028 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551100-l6b4v"] Mar 09 14:26:06 crc kubenswrapper[4764]: I0309 14:26:06.525201 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551100-l6b4v"] Mar 09 14:26:07 crc kubenswrapper[4764]: I0309 14:26:07.571845 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac3d8a45-0030-433c-a813-fa93811b952f" path="/var/lib/kubelet/pods/ac3d8a45-0030-433c-a813-fa93811b952f/volumes" Mar 09 14:26:11 crc kubenswrapper[4764]: I0309 14:26:11.667961 4764 scope.go:117] "RemoveContainer" containerID="0555c912480a032e6e1167209d44998b6306e07cd673abce316913bd071bd80f" Mar 09 14:26:13 crc kubenswrapper[4764]: I0309 14:26:13.560699 4764 scope.go:117] "RemoveContainer" 
containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:26:13 crc kubenswrapper[4764]: E0309 14:26:13.561483 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:26:27 crc kubenswrapper[4764]: I0309 14:26:27.561045 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:26:27 crc kubenswrapper[4764]: E0309 14:26:27.562842 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:26:38 crc kubenswrapper[4764]: I0309 14:26:38.560290 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:26:38 crc kubenswrapper[4764]: E0309 14:26:38.561493 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:26:53 crc kubenswrapper[4764]: I0309 14:26:53.561142 4764 scope.go:117] 
"RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:26:53 crc kubenswrapper[4764]: E0309 14:26:53.562223 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:27:04 crc kubenswrapper[4764]: I0309 14:27:04.560741 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:27:05 crc kubenswrapper[4764]: I0309 14:27:05.017435 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"956798d3640b6af2f25ed443a7b5b3c26e0da670aff756c574b1a30cc50879dd"} Mar 09 14:28:00 crc kubenswrapper[4764]: I0309 14:28:00.153992 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551108-8tqjz"] Mar 09 14:28:00 crc kubenswrapper[4764]: E0309 14:28:00.155521 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d33fc9b0-e440-4f1b-9522-1abec06eca2a" containerName="oc" Mar 09 14:28:00 crc kubenswrapper[4764]: I0309 14:28:00.155541 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d33fc9b0-e440-4f1b-9522-1abec06eca2a" containerName="oc" Mar 09 14:28:00 crc kubenswrapper[4764]: I0309 14:28:00.155873 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d33fc9b0-e440-4f1b-9522-1abec06eca2a" containerName="oc" Mar 09 14:28:00 crc kubenswrapper[4764]: I0309 14:28:00.156905 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551108-8tqjz" Mar 09 14:28:00 crc kubenswrapper[4764]: I0309 14:28:00.159583 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:28:00 crc kubenswrapper[4764]: I0309 14:28:00.159757 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 14:28:00 crc kubenswrapper[4764]: I0309 14:28:00.160069 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:28:00 crc kubenswrapper[4764]: I0309 14:28:00.166883 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551108-8tqjz"] Mar 09 14:28:00 crc kubenswrapper[4764]: I0309 14:28:00.258948 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs8d7\" (UniqueName: \"kubernetes.io/projected/1f49d558-1317-4099-abb0-bb57895b3917-kube-api-access-fs8d7\") pod \"auto-csr-approver-29551108-8tqjz\" (UID: \"1f49d558-1317-4099-abb0-bb57895b3917\") " pod="openshift-infra/auto-csr-approver-29551108-8tqjz" Mar 09 14:28:00 crc kubenswrapper[4764]: I0309 14:28:00.361220 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs8d7\" (UniqueName: \"kubernetes.io/projected/1f49d558-1317-4099-abb0-bb57895b3917-kube-api-access-fs8d7\") pod \"auto-csr-approver-29551108-8tqjz\" (UID: \"1f49d558-1317-4099-abb0-bb57895b3917\") " pod="openshift-infra/auto-csr-approver-29551108-8tqjz" Mar 09 14:28:00 crc kubenswrapper[4764]: I0309 14:28:00.386226 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs8d7\" (UniqueName: \"kubernetes.io/projected/1f49d558-1317-4099-abb0-bb57895b3917-kube-api-access-fs8d7\") pod \"auto-csr-approver-29551108-8tqjz\" (UID: \"1f49d558-1317-4099-abb0-bb57895b3917\") " 
pod="openshift-infra/auto-csr-approver-29551108-8tqjz" Mar 09 14:28:00 crc kubenswrapper[4764]: I0309 14:28:00.487817 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551108-8tqjz" Mar 09 14:28:00 crc kubenswrapper[4764]: I0309 14:28:00.998605 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551108-8tqjz"] Mar 09 14:28:01 crc kubenswrapper[4764]: I0309 14:28:01.686805 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551108-8tqjz" event={"ID":"1f49d558-1317-4099-abb0-bb57895b3917","Type":"ContainerStarted","Data":"4bf524a7497f95a2615264ebe34d165b98a41502eaf492b05240e40372c8f8f6"} Mar 09 14:28:02 crc kubenswrapper[4764]: E0309 14:28:02.634605 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f49d558_1317_4099_abb0_bb57895b3917.slice/crio-conmon-3d8746bde1498da01929d78c37701a3fb1698a0044a00e71ddbcca0d4465fb3c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f49d558_1317_4099_abb0_bb57895b3917.slice/crio-3d8746bde1498da01929d78c37701a3fb1698a0044a00e71ddbcca0d4465fb3c.scope\": RecentStats: unable to find data in memory cache]" Mar 09 14:28:02 crc kubenswrapper[4764]: I0309 14:28:02.699998 4764 generic.go:334] "Generic (PLEG): container finished" podID="1f49d558-1317-4099-abb0-bb57895b3917" containerID="3d8746bde1498da01929d78c37701a3fb1698a0044a00e71ddbcca0d4465fb3c" exitCode=0 Mar 09 14:28:02 crc kubenswrapper[4764]: I0309 14:28:02.700415 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551108-8tqjz" event={"ID":"1f49d558-1317-4099-abb0-bb57895b3917","Type":"ContainerDied","Data":"3d8746bde1498da01929d78c37701a3fb1698a0044a00e71ddbcca0d4465fb3c"} Mar 09 
14:28:04 crc kubenswrapper[4764]: I0309 14:28:04.208673 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551108-8tqjz" Mar 09 14:28:04 crc kubenswrapper[4764]: I0309 14:28:04.385364 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fs8d7\" (UniqueName: \"kubernetes.io/projected/1f49d558-1317-4099-abb0-bb57895b3917-kube-api-access-fs8d7\") pod \"1f49d558-1317-4099-abb0-bb57895b3917\" (UID: \"1f49d558-1317-4099-abb0-bb57895b3917\") " Mar 09 14:28:04 crc kubenswrapper[4764]: I0309 14:28:04.391741 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f49d558-1317-4099-abb0-bb57895b3917-kube-api-access-fs8d7" (OuterVolumeSpecName: "kube-api-access-fs8d7") pod "1f49d558-1317-4099-abb0-bb57895b3917" (UID: "1f49d558-1317-4099-abb0-bb57895b3917"). InnerVolumeSpecName "kube-api-access-fs8d7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:28:04 crc kubenswrapper[4764]: I0309 14:28:04.490858 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fs8d7\" (UniqueName: \"kubernetes.io/projected/1f49d558-1317-4099-abb0-bb57895b3917-kube-api-access-fs8d7\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:04 crc kubenswrapper[4764]: I0309 14:28:04.723130 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551108-8tqjz" event={"ID":"1f49d558-1317-4099-abb0-bb57895b3917","Type":"ContainerDied","Data":"4bf524a7497f95a2615264ebe34d165b98a41502eaf492b05240e40372c8f8f6"} Mar 09 14:28:04 crc kubenswrapper[4764]: I0309 14:28:04.723198 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bf524a7497f95a2615264ebe34d165b98a41502eaf492b05240e40372c8f8f6" Mar 09 14:28:04 crc kubenswrapper[4764]: I0309 14:28:04.723243 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551108-8tqjz" Mar 09 14:28:05 crc kubenswrapper[4764]: I0309 14:28:05.289446 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551102-gnwjq"] Mar 09 14:28:05 crc kubenswrapper[4764]: I0309 14:28:05.303954 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551102-gnwjq"] Mar 09 14:28:05 crc kubenswrapper[4764]: I0309 14:28:05.575280 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f976f5d-f876-491a-8557-f6755b9641a3" path="/var/lib/kubelet/pods/8f976f5d-f876-491a-8557-f6755b9641a3/volumes" Mar 09 14:28:11 crc kubenswrapper[4764]: I0309 14:28:11.803054 4764 scope.go:117] "RemoveContainer" containerID="469e856eb9ee2a8bcc4f55156a6acb6b99b858dcd6fc4bec4aaa66e7dc6669a8" Mar 09 14:28:51 crc kubenswrapper[4764]: I0309 14:28:51.194020 4764 generic.go:334] "Generic (PLEG): container finished" podID="5db22a0e-ee1a-4b26-9e49-b26644266834" containerID="88276b497758f3b5f0f6060dd845ad4b0ff499d7eeb52fb1af2c65524e2ceaba" exitCode=1 Mar 09 14:28:51 crc kubenswrapper[4764]: I0309 14:28:51.194114 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5db22a0e-ee1a-4b26-9e49-b26644266834","Type":"ContainerDied","Data":"88276b497758f3b5f0f6060dd845ad4b0ff499d7eeb52fb1af2c65524e2ceaba"} Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.591941 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.678928 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5db22a0e-ee1a-4b26-9e49-b26644266834-test-operator-ephemeral-temporary\") pod \"5db22a0e-ee1a-4b26-9e49-b26644266834\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.679141 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-openstack-config-secret\") pod \"5db22a0e-ee1a-4b26-9e49-b26644266834\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.679183 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5db22a0e-ee1a-4b26-9e49-b26644266834-test-operator-ephemeral-workdir\") pod \"5db22a0e-ee1a-4b26-9e49-b26644266834\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.679213 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"5db22a0e-ee1a-4b26-9e49-b26644266834\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.679277 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-ssh-key\") pod \"5db22a0e-ee1a-4b26-9e49-b26644266834\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.679358 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-9db4k\" (UniqueName: \"kubernetes.io/projected/5db22a0e-ee1a-4b26-9e49-b26644266834-kube-api-access-9db4k\") pod \"5db22a0e-ee1a-4b26-9e49-b26644266834\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.679414 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5db22a0e-ee1a-4b26-9e49-b26644266834-openstack-config\") pod \"5db22a0e-ee1a-4b26-9e49-b26644266834\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.679479 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-ca-certs\") pod \"5db22a0e-ee1a-4b26-9e49-b26644266834\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.679554 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5db22a0e-ee1a-4b26-9e49-b26644266834-config-data\") pod \"5db22a0e-ee1a-4b26-9e49-b26644266834\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.679805 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5db22a0e-ee1a-4b26-9e49-b26644266834-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "5db22a0e-ee1a-4b26-9e49-b26644266834" (UID: "5db22a0e-ee1a-4b26-9e49-b26644266834"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.680380 4764 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5db22a0e-ee1a-4b26-9e49-b26644266834-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.680673 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5db22a0e-ee1a-4b26-9e49-b26644266834-config-data" (OuterVolumeSpecName: "config-data") pod "5db22a0e-ee1a-4b26-9e49-b26644266834" (UID: "5db22a0e-ee1a-4b26-9e49-b26644266834"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.686807 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5db22a0e-ee1a-4b26-9e49-b26644266834-kube-api-access-9db4k" (OuterVolumeSpecName: "kube-api-access-9db4k") pod "5db22a0e-ee1a-4b26-9e49-b26644266834" (UID: "5db22a0e-ee1a-4b26-9e49-b26644266834"). InnerVolumeSpecName "kube-api-access-9db4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.686842 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5db22a0e-ee1a-4b26-9e49-b26644266834-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "5db22a0e-ee1a-4b26-9e49-b26644266834" (UID: "5db22a0e-ee1a-4b26-9e49-b26644266834"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.686870 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "test-operator-logs") pod "5db22a0e-ee1a-4b26-9e49-b26644266834" (UID: "5db22a0e-ee1a-4b26-9e49-b26644266834"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.711914 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5db22a0e-ee1a-4b26-9e49-b26644266834" (UID: "5db22a0e-ee1a-4b26-9e49-b26644266834"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.712522 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "5db22a0e-ee1a-4b26-9e49-b26644266834" (UID: "5db22a0e-ee1a-4b26-9e49-b26644266834"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.714853 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "5db22a0e-ee1a-4b26-9e49-b26644266834" (UID: "5db22a0e-ee1a-4b26-9e49-b26644266834"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.741121 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5db22a0e-ee1a-4b26-9e49-b26644266834-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "5db22a0e-ee1a-4b26-9e49-b26644266834" (UID: "5db22a0e-ee1a-4b26-9e49-b26644266834"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.782320 4764 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.782366 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5db22a0e-ee1a-4b26-9e49-b26644266834-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.782381 4764 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.782396 4764 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5db22a0e-ee1a-4b26-9e49-b26644266834-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.782441 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.782452 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.782462 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9db4k\" (UniqueName: \"kubernetes.io/projected/5db22a0e-ee1a-4b26-9e49-b26644266834-kube-api-access-9db4k\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.782470 4764 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5db22a0e-ee1a-4b26-9e49-b26644266834-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.815351 4764 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.884887 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:53 crc kubenswrapper[4764]: I0309 14:28:53.214633 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5db22a0e-ee1a-4b26-9e49-b26644266834","Type":"ContainerDied","Data":"bb1050b512fa7c7108bc1149d279ca654a7191121729b646cce9a919a3a91226"} Mar 09 14:28:53 crc kubenswrapper[4764]: I0309 14:28:53.214721 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb1050b512fa7c7108bc1149d279ca654a7191121729b646cce9a919a3a91226" Mar 09 14:28:53 crc kubenswrapper[4764]: I0309 14:28:53.214775 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 09 14:28:55 crc kubenswrapper[4764]: I0309 14:28:55.914325 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 09 14:28:55 crc kubenswrapper[4764]: E0309 14:28:55.916951 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5db22a0e-ee1a-4b26-9e49-b26644266834" containerName="tempest-tests-tempest-tests-runner" Mar 09 14:28:55 crc kubenswrapper[4764]: I0309 14:28:55.917057 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5db22a0e-ee1a-4b26-9e49-b26644266834" containerName="tempest-tests-tempest-tests-runner" Mar 09 14:28:55 crc kubenswrapper[4764]: E0309 14:28:55.917162 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f49d558-1317-4099-abb0-bb57895b3917" containerName="oc" Mar 09 14:28:55 crc kubenswrapper[4764]: I0309 14:28:55.917245 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f49d558-1317-4099-abb0-bb57895b3917" containerName="oc" Mar 09 14:28:55 crc kubenswrapper[4764]: I0309 14:28:55.917588 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f49d558-1317-4099-abb0-bb57895b3917" containerName="oc" Mar 09 14:28:55 crc kubenswrapper[4764]: I0309 14:28:55.917713 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5db22a0e-ee1a-4b26-9e49-b26644266834" containerName="tempest-tests-tempest-tests-runner" Mar 09 14:28:55 crc kubenswrapper[4764]: I0309 14:28:55.918722 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 14:28:55 crc kubenswrapper[4764]: I0309 14:28:55.921175 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-9bk55" Mar 09 14:28:55 crc kubenswrapper[4764]: I0309 14:28:55.928390 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 09 14:28:56 crc kubenswrapper[4764]: I0309 14:28:56.060709 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3b233056-629a-4653-8726-76e6b231e58b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 14:28:56 crc kubenswrapper[4764]: I0309 14:28:56.061140 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pbmq\" (UniqueName: \"kubernetes.io/projected/3b233056-629a-4653-8726-76e6b231e58b-kube-api-access-2pbmq\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3b233056-629a-4653-8726-76e6b231e58b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 14:28:56 crc kubenswrapper[4764]: I0309 14:28:56.163719 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3b233056-629a-4653-8726-76e6b231e58b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 14:28:56 crc kubenswrapper[4764]: I0309 14:28:56.163844 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pbmq\" (UniqueName: 
\"kubernetes.io/projected/3b233056-629a-4653-8726-76e6b231e58b-kube-api-access-2pbmq\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3b233056-629a-4653-8726-76e6b231e58b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 14:28:56 crc kubenswrapper[4764]: I0309 14:28:56.164491 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3b233056-629a-4653-8726-76e6b231e58b\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 14:28:56 crc kubenswrapper[4764]: I0309 14:28:56.186458 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pbmq\" (UniqueName: \"kubernetes.io/projected/3b233056-629a-4653-8726-76e6b231e58b-kube-api-access-2pbmq\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3b233056-629a-4653-8726-76e6b231e58b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 14:28:56 crc kubenswrapper[4764]: I0309 14:28:56.193855 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3b233056-629a-4653-8726-76e6b231e58b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 14:28:56 crc kubenswrapper[4764]: I0309 14:28:56.236465 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 14:28:56 crc kubenswrapper[4764]: I0309 14:28:56.719293 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 09 14:28:57 crc kubenswrapper[4764]: I0309 14:28:57.271617 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"3b233056-629a-4653-8726-76e6b231e58b","Type":"ContainerStarted","Data":"038f360b2c61c4b55ed7bc17478170c2792e08704ea7f96f87972b775e992e5a"} Mar 09 14:28:58 crc kubenswrapper[4764]: I0309 14:28:58.287544 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"3b233056-629a-4653-8726-76e6b231e58b","Type":"ContainerStarted","Data":"ad51648649dfbb2f41e1f67e4374471c83a02d3db0063e492c33bb98690bf63c"} Mar 09 14:29:03 crc kubenswrapper[4764]: I0309 14:29:03.666461 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=7.83190881 podStartE2EDuration="8.666434833s" podCreationTimestamp="2026-03-09 14:28:55 +0000 UTC" firstStartedPulling="2026-03-09 14:28:56.731603636 +0000 UTC m=+4091.981775544" lastFinishedPulling="2026-03-09 14:28:57.566129659 +0000 UTC m=+4092.816301567" observedRunningTime="2026-03-09 14:28:58.307147627 +0000 UTC m=+4093.557319555" watchObservedRunningTime="2026-03-09 14:29:03.666434833 +0000 UTC m=+4098.916606741" Mar 09 14:29:03 crc kubenswrapper[4764]: I0309 14:29:03.673086 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ggs99"] Mar 09 14:29:03 crc kubenswrapper[4764]: I0309 14:29:03.675721 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ggs99" Mar 09 14:29:03 crc kubenswrapper[4764]: I0309 14:29:03.690002 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ggs99"] Mar 09 14:29:03 crc kubenswrapper[4764]: I0309 14:29:03.764408 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxvvn\" (UniqueName: \"kubernetes.io/projected/176982f0-3e86-471c-8054-13490ee485bb-kube-api-access-gxvvn\") pod \"community-operators-ggs99\" (UID: \"176982f0-3e86-471c-8054-13490ee485bb\") " pod="openshift-marketplace/community-operators-ggs99" Mar 09 14:29:03 crc kubenswrapper[4764]: I0309 14:29:03.765219 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/176982f0-3e86-471c-8054-13490ee485bb-catalog-content\") pod \"community-operators-ggs99\" (UID: \"176982f0-3e86-471c-8054-13490ee485bb\") " pod="openshift-marketplace/community-operators-ggs99" Mar 09 14:29:03 crc kubenswrapper[4764]: I0309 14:29:03.765351 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/176982f0-3e86-471c-8054-13490ee485bb-utilities\") pod \"community-operators-ggs99\" (UID: \"176982f0-3e86-471c-8054-13490ee485bb\") " pod="openshift-marketplace/community-operators-ggs99" Mar 09 14:29:03 crc kubenswrapper[4764]: I0309 14:29:03.868014 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/176982f0-3e86-471c-8054-13490ee485bb-catalog-content\") pod \"community-operators-ggs99\" (UID: \"176982f0-3e86-471c-8054-13490ee485bb\") " pod="openshift-marketplace/community-operators-ggs99" Mar 09 14:29:03 crc kubenswrapper[4764]: I0309 14:29:03.868078 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/176982f0-3e86-471c-8054-13490ee485bb-utilities\") pod \"community-operators-ggs99\" (UID: \"176982f0-3e86-471c-8054-13490ee485bb\") " pod="openshift-marketplace/community-operators-ggs99" Mar 09 14:29:03 crc kubenswrapper[4764]: I0309 14:29:03.868141 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxvvn\" (UniqueName: \"kubernetes.io/projected/176982f0-3e86-471c-8054-13490ee485bb-kube-api-access-gxvvn\") pod \"community-operators-ggs99\" (UID: \"176982f0-3e86-471c-8054-13490ee485bb\") " pod="openshift-marketplace/community-operators-ggs99" Mar 09 14:29:03 crc kubenswrapper[4764]: I0309 14:29:03.868719 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/176982f0-3e86-471c-8054-13490ee485bb-catalog-content\") pod \"community-operators-ggs99\" (UID: \"176982f0-3e86-471c-8054-13490ee485bb\") " pod="openshift-marketplace/community-operators-ggs99" Mar 09 14:29:03 crc kubenswrapper[4764]: I0309 14:29:03.868879 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/176982f0-3e86-471c-8054-13490ee485bb-utilities\") pod \"community-operators-ggs99\" (UID: \"176982f0-3e86-471c-8054-13490ee485bb\") " pod="openshift-marketplace/community-operators-ggs99" Mar 09 14:29:03 crc kubenswrapper[4764]: I0309 14:29:03.889909 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxvvn\" (UniqueName: \"kubernetes.io/projected/176982f0-3e86-471c-8054-13490ee485bb-kube-api-access-gxvvn\") pod \"community-operators-ggs99\" (UID: \"176982f0-3e86-471c-8054-13490ee485bb\") " pod="openshift-marketplace/community-operators-ggs99" Mar 09 14:29:04 crc kubenswrapper[4764]: I0309 14:29:04.003719 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ggs99" Mar 09 14:29:04 crc kubenswrapper[4764]: I0309 14:29:04.103110 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8g2gm"] Mar 09 14:29:04 crc kubenswrapper[4764]: I0309 14:29:04.105528 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:04 crc kubenswrapper[4764]: I0309 14:29:04.115673 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8g2gm"] Mar 09 14:29:04 crc kubenswrapper[4764]: I0309 14:29:04.175515 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-catalog-content\") pod \"certified-operators-8g2gm\" (UID: \"07b9f49e-cd4b-44ac-ba0b-22764e3372b3\") " pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:04 crc kubenswrapper[4764]: I0309 14:29:04.175886 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mlds\" (UniqueName: \"kubernetes.io/projected/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-kube-api-access-8mlds\") pod \"certified-operators-8g2gm\" (UID: \"07b9f49e-cd4b-44ac-ba0b-22764e3372b3\") " pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:04 crc kubenswrapper[4764]: I0309 14:29:04.176177 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-utilities\") pod \"certified-operators-8g2gm\" (UID: \"07b9f49e-cd4b-44ac-ba0b-22764e3372b3\") " pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:04 crc kubenswrapper[4764]: I0309 14:29:04.285098 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-catalog-content\") pod \"certified-operators-8g2gm\" (UID: \"07b9f49e-cd4b-44ac-ba0b-22764e3372b3\") " pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:04 crc kubenswrapper[4764]: I0309 14:29:04.285197 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mlds\" (UniqueName: \"kubernetes.io/projected/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-kube-api-access-8mlds\") pod \"certified-operators-8g2gm\" (UID: \"07b9f49e-cd4b-44ac-ba0b-22764e3372b3\") " pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:04 crc kubenswrapper[4764]: I0309 14:29:04.285259 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-utilities\") pod \"certified-operators-8g2gm\" (UID: \"07b9f49e-cd4b-44ac-ba0b-22764e3372b3\") " pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:04 crc kubenswrapper[4764]: I0309 14:29:04.285833 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-utilities\") pod \"certified-operators-8g2gm\" (UID: \"07b9f49e-cd4b-44ac-ba0b-22764e3372b3\") " pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:04 crc kubenswrapper[4764]: I0309 14:29:04.286063 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-catalog-content\") pod \"certified-operators-8g2gm\" (UID: \"07b9f49e-cd4b-44ac-ba0b-22764e3372b3\") " pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:04 crc kubenswrapper[4764]: I0309 14:29:04.337071 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mlds\" (UniqueName: 
\"kubernetes.io/projected/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-kube-api-access-8mlds\") pod \"certified-operators-8g2gm\" (UID: \"07b9f49e-cd4b-44ac-ba0b-22764e3372b3\") " pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:04 crc kubenswrapper[4764]: I0309 14:29:04.542001 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:04 crc kubenswrapper[4764]: I0309 14:29:04.702778 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ggs99"] Mar 09 14:29:05 crc kubenswrapper[4764]: I0309 14:29:05.091232 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8g2gm"] Mar 09 14:29:05 crc kubenswrapper[4764]: W0309 14:29:05.100049 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07b9f49e_cd4b_44ac_ba0b_22764e3372b3.slice/crio-b93b6586819dc583657fbe5fb0e96fd38cad070aa30246cf99d71f1f6cb66d31 WatchSource:0}: Error finding container b93b6586819dc583657fbe5fb0e96fd38cad070aa30246cf99d71f1f6cb66d31: Status 404 returned error can't find the container with id b93b6586819dc583657fbe5fb0e96fd38cad070aa30246cf99d71f1f6cb66d31 Mar 09 14:29:05 crc kubenswrapper[4764]: I0309 14:29:05.455216 4764 generic.go:334] "Generic (PLEG): container finished" podID="176982f0-3e86-471c-8054-13490ee485bb" containerID="d8c80c11cf0e967a1dfd8c4b93c815a24716ae93a258544f33ea5e567ff6734c" exitCode=0 Mar 09 14:29:05 crc kubenswrapper[4764]: I0309 14:29:05.455360 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ggs99" event={"ID":"176982f0-3e86-471c-8054-13490ee485bb","Type":"ContainerDied","Data":"d8c80c11cf0e967a1dfd8c4b93c815a24716ae93a258544f33ea5e567ff6734c"} Mar 09 14:29:05 crc kubenswrapper[4764]: I0309 14:29:05.455451 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-ggs99" event={"ID":"176982f0-3e86-471c-8054-13490ee485bb","Type":"ContainerStarted","Data":"166077c5e519763979f01840287cd8d08640b9365b4dc9e96e36e1f7da6e7ee8"} Mar 09 14:29:05 crc kubenswrapper[4764]: I0309 14:29:05.457427 4764 generic.go:334] "Generic (PLEG): container finished" podID="07b9f49e-cd4b-44ac-ba0b-22764e3372b3" containerID="1d44c6a6f1ca51015449d9711ca6e1ee57f52a1f6f76c5f60c6f3d34e343f24f" exitCode=0 Mar 09 14:29:05 crc kubenswrapper[4764]: I0309 14:29:05.457549 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8g2gm" event={"ID":"07b9f49e-cd4b-44ac-ba0b-22764e3372b3","Type":"ContainerDied","Data":"1d44c6a6f1ca51015449d9711ca6e1ee57f52a1f6f76c5f60c6f3d34e343f24f"} Mar 09 14:29:05 crc kubenswrapper[4764]: I0309 14:29:05.457664 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8g2gm" event={"ID":"07b9f49e-cd4b-44ac-ba0b-22764e3372b3","Type":"ContainerStarted","Data":"b93b6586819dc583657fbe5fb0e96fd38cad070aa30246cf99d71f1f6cb66d31"} Mar 09 14:29:05 crc kubenswrapper[4764]: I0309 14:29:05.457602 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 14:29:06 crc kubenswrapper[4764]: I0309 14:29:06.470463 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8g2gm" event={"ID":"07b9f49e-cd4b-44ac-ba0b-22764e3372b3","Type":"ContainerStarted","Data":"300cab71666fbf2679ea5e68d0f3e8d76de51a643d44d801f0ca2059b05ed1fa"} Mar 09 14:29:08 crc kubenswrapper[4764]: I0309 14:29:08.502375 4764 generic.go:334] "Generic (PLEG): container finished" podID="07b9f49e-cd4b-44ac-ba0b-22764e3372b3" containerID="300cab71666fbf2679ea5e68d0f3e8d76de51a643d44d801f0ca2059b05ed1fa" exitCode=0 Mar 09 14:29:08 crc kubenswrapper[4764]: I0309 14:29:08.502459 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-8g2gm" event={"ID":"07b9f49e-cd4b-44ac-ba0b-22764e3372b3","Type":"ContainerDied","Data":"300cab71666fbf2679ea5e68d0f3e8d76de51a643d44d801f0ca2059b05ed1fa"} Mar 09 14:29:11 crc kubenswrapper[4764]: I0309 14:29:11.541804 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8g2gm" event={"ID":"07b9f49e-cd4b-44ac-ba0b-22764e3372b3","Type":"ContainerStarted","Data":"ed793f34ea4538da0c31464a86c25bec86e493f91e44da4d2e6815b6221e542b"} Mar 09 14:29:11 crc kubenswrapper[4764]: I0309 14:29:11.544616 4764 generic.go:334] "Generic (PLEG): container finished" podID="176982f0-3e86-471c-8054-13490ee485bb" containerID="4a341a4b28fa7bb07af4dc3bc024fa42c3d793bbdd626124ab77730a3005a5c8" exitCode=0 Mar 09 14:29:11 crc kubenswrapper[4764]: I0309 14:29:11.544691 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ggs99" event={"ID":"176982f0-3e86-471c-8054-13490ee485bb","Type":"ContainerDied","Data":"4a341a4b28fa7bb07af4dc3bc024fa42c3d793bbdd626124ab77730a3005a5c8"} Mar 09 14:29:11 crc kubenswrapper[4764]: I0309 14:29:11.600974 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8g2gm" podStartSLOduration=2.53528242 podStartE2EDuration="7.600949309s" podCreationTimestamp="2026-03-09 14:29:04 +0000 UTC" firstStartedPulling="2026-03-09 14:29:05.459101042 +0000 UTC m=+4100.709272950" lastFinishedPulling="2026-03-09 14:29:10.524767931 +0000 UTC m=+4105.774939839" observedRunningTime="2026-03-09 14:29:11.575114587 +0000 UTC m=+4106.825286505" watchObservedRunningTime="2026-03-09 14:29:11.600949309 +0000 UTC m=+4106.851121217" Mar 09 14:29:12 crc kubenswrapper[4764]: I0309 14:29:12.557210 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ggs99" 
event={"ID":"176982f0-3e86-471c-8054-13490ee485bb","Type":"ContainerStarted","Data":"179e4497a5b2a885ab4606b5d2d90dd9f1fdbd0ba61ca5cc3671d4047b8df2cc"} Mar 09 14:29:12 crc kubenswrapper[4764]: I0309 14:29:12.586773 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ggs99" podStartSLOduration=3.055209586 podStartE2EDuration="9.586741066s" podCreationTimestamp="2026-03-09 14:29:03 +0000 UTC" firstStartedPulling="2026-03-09 14:29:05.457377586 +0000 UTC m=+4100.707549494" lastFinishedPulling="2026-03-09 14:29:11.988909066 +0000 UTC m=+4107.239080974" observedRunningTime="2026-03-09 14:29:12.577629882 +0000 UTC m=+4107.827801790" watchObservedRunningTime="2026-03-09 14:29:12.586741066 +0000 UTC m=+4107.836912974" Mar 09 14:29:14 crc kubenswrapper[4764]: I0309 14:29:14.004577 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ggs99" Mar 09 14:29:14 crc kubenswrapper[4764]: I0309 14:29:14.005042 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ggs99" Mar 09 14:29:14 crc kubenswrapper[4764]: I0309 14:29:14.542335 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:14 crc kubenswrapper[4764]: I0309 14:29:14.542848 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:14 crc kubenswrapper[4764]: I0309 14:29:14.596082 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:15 crc kubenswrapper[4764]: I0309 14:29:15.059194 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-ggs99" podUID="176982f0-3e86-471c-8054-13490ee485bb" containerName="registry-server" 
probeResult="failure" output=< Mar 09 14:29:15 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Mar 09 14:29:15 crc kubenswrapper[4764]: > Mar 09 14:29:24 crc kubenswrapper[4764]: I0309 14:29:24.054853 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ggs99" Mar 09 14:29:24 crc kubenswrapper[4764]: I0309 14:29:24.122509 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ggs99" Mar 09 14:29:24 crc kubenswrapper[4764]: I0309 14:29:24.595322 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:26 crc kubenswrapper[4764]: I0309 14:29:26.973429 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ggs99"] Mar 09 14:29:27 crc kubenswrapper[4764]: I0309 14:29:27.345887 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4sxc8"] Mar 09 14:29:27 crc kubenswrapper[4764]: I0309 14:29:27.346632 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4sxc8" podUID="fa6ff5f6-9328-419b-a996-05bcf478b446" containerName="registry-server" containerID="cri-o://cbaaee31f94d5b978452176039ab59d2520f7f3d0d136e8695a1528c41a1e96e" gracePeriod=2 Mar 09 14:29:27 crc kubenswrapper[4764]: I0309 14:29:27.722261 4764 generic.go:334] "Generic (PLEG): container finished" podID="fa6ff5f6-9328-419b-a996-05bcf478b446" containerID="cbaaee31f94d5b978452176039ab59d2520f7f3d0d136e8695a1528c41a1e96e" exitCode=0 Mar 09 14:29:27 crc kubenswrapper[4764]: I0309 14:29:27.722855 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4sxc8" 
event={"ID":"fa6ff5f6-9328-419b-a996-05bcf478b446","Type":"ContainerDied","Data":"cbaaee31f94d5b978452176039ab59d2520f7f3d0d136e8695a1528c41a1e96e"} Mar 09 14:29:27 crc kubenswrapper[4764]: I0309 14:29:27.918083 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4sxc8" Mar 09 14:29:27 crc kubenswrapper[4764]: I0309 14:29:27.959400 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8g2gm"] Mar 09 14:29:27 crc kubenswrapper[4764]: I0309 14:29:27.959784 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8g2gm" podUID="07b9f49e-cd4b-44ac-ba0b-22764e3372b3" containerName="registry-server" containerID="cri-o://ed793f34ea4538da0c31464a86c25bec86e493f91e44da4d2e6815b6221e542b" gracePeriod=2 Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.109671 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p26kd\" (UniqueName: \"kubernetes.io/projected/fa6ff5f6-9328-419b-a996-05bcf478b446-kube-api-access-p26kd\") pod \"fa6ff5f6-9328-419b-a996-05bcf478b446\" (UID: \"fa6ff5f6-9328-419b-a996-05bcf478b446\") " Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.110003 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa6ff5f6-9328-419b-a996-05bcf478b446-catalog-content\") pod \"fa6ff5f6-9328-419b-a996-05bcf478b446\" (UID: \"fa6ff5f6-9328-419b-a996-05bcf478b446\") " Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.110263 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa6ff5f6-9328-419b-a996-05bcf478b446-utilities\") pod \"fa6ff5f6-9328-419b-a996-05bcf478b446\" (UID: \"fa6ff5f6-9328-419b-a996-05bcf478b446\") " Mar 09 14:29:28 crc 
kubenswrapper[4764]: I0309 14:29:28.112448 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa6ff5f6-9328-419b-a996-05bcf478b446-utilities" (OuterVolumeSpecName: "utilities") pod "fa6ff5f6-9328-419b-a996-05bcf478b446" (UID: "fa6ff5f6-9328-419b-a996-05bcf478b446"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.123631 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa6ff5f6-9328-419b-a996-05bcf478b446-kube-api-access-p26kd" (OuterVolumeSpecName: "kube-api-access-p26kd") pod "fa6ff5f6-9328-419b-a996-05bcf478b446" (UID: "fa6ff5f6-9328-419b-a996-05bcf478b446"). InnerVolumeSpecName "kube-api-access-p26kd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.200827 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa6ff5f6-9328-419b-a996-05bcf478b446-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa6ff5f6-9328-419b-a996-05bcf478b446" (UID: "fa6ff5f6-9328-419b-a996-05bcf478b446"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.219012 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa6ff5f6-9328-419b-a996-05bcf478b446-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.219058 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa6ff5f6-9328-419b-a996-05bcf478b446-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.219073 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p26kd\" (UniqueName: \"kubernetes.io/projected/fa6ff5f6-9328-419b-a996-05bcf478b446-kube-api-access-p26kd\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.370753 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.370960 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.400241 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.422030 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-catalog-content\") pod \"07b9f49e-cd4b-44ac-ba0b-22764e3372b3\" (UID: \"07b9f49e-cd4b-44ac-ba0b-22764e3372b3\") " Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.422097 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-utilities\") pod \"07b9f49e-cd4b-44ac-ba0b-22764e3372b3\" (UID: \"07b9f49e-cd4b-44ac-ba0b-22764e3372b3\") " Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.422274 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mlds\" (UniqueName: \"kubernetes.io/projected/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-kube-api-access-8mlds\") pod \"07b9f49e-cd4b-44ac-ba0b-22764e3372b3\" (UID: \"07b9f49e-cd4b-44ac-ba0b-22764e3372b3\") " Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.426109 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-utilities" (OuterVolumeSpecName: "utilities") pod "07b9f49e-cd4b-44ac-ba0b-22764e3372b3" (UID: "07b9f49e-cd4b-44ac-ba0b-22764e3372b3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.429609 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-kube-api-access-8mlds" (OuterVolumeSpecName: "kube-api-access-8mlds") pod "07b9f49e-cd4b-44ac-ba0b-22764e3372b3" (UID: "07b9f49e-cd4b-44ac-ba0b-22764e3372b3"). InnerVolumeSpecName "kube-api-access-8mlds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.511321 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "07b9f49e-cd4b-44ac-ba0b-22764e3372b3" (UID: "07b9f49e-cd4b-44ac-ba0b-22764e3372b3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.525897 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mlds\" (UniqueName: \"kubernetes.io/projected/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-kube-api-access-8mlds\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.525943 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.525955 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.735212 4764 generic.go:334] "Generic (PLEG): container finished" podID="07b9f49e-cd4b-44ac-ba0b-22764e3372b3" containerID="ed793f34ea4538da0c31464a86c25bec86e493f91e44da4d2e6815b6221e542b" exitCode=0 Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.735281 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8g2gm" event={"ID":"07b9f49e-cd4b-44ac-ba0b-22764e3372b3","Type":"ContainerDied","Data":"ed793f34ea4538da0c31464a86c25bec86e493f91e44da4d2e6815b6221e542b"} Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.735317 4764 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-8g2gm" event={"ID":"07b9f49e-cd4b-44ac-ba0b-22764e3372b3","Type":"ContainerDied","Data":"b93b6586819dc583657fbe5fb0e96fd38cad070aa30246cf99d71f1f6cb66d31"} Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.735338 4764 scope.go:117] "RemoveContainer" containerID="ed793f34ea4538da0c31464a86c25bec86e493f91e44da4d2e6815b6221e542b" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.735477 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.745086 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4sxc8" event={"ID":"fa6ff5f6-9328-419b-a996-05bcf478b446","Type":"ContainerDied","Data":"fa2faca2b6f361aab53dbf1e7c047f8fa427577595f01228a4ee0b2b2c128cfa"} Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.745171 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4sxc8" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.789936 4764 scope.go:117] "RemoveContainer" containerID="300cab71666fbf2679ea5e68d0f3e8d76de51a643d44d801f0ca2059b05ed1fa" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.807821 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8g2gm"] Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.819040 4764 scope.go:117] "RemoveContainer" containerID="1d44c6a6f1ca51015449d9711ca6e1ee57f52a1f6f76c5f60c6f3d34e343f24f" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.827580 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8g2gm"] Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.840712 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4sxc8"] Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.863553 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4sxc8"] Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.873530 4764 scope.go:117] "RemoveContainer" containerID="ed793f34ea4538da0c31464a86c25bec86e493f91e44da4d2e6815b6221e542b" Mar 09 14:29:28 crc kubenswrapper[4764]: E0309 14:29:28.874151 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed793f34ea4538da0c31464a86c25bec86e493f91e44da4d2e6815b6221e542b\": container with ID starting with ed793f34ea4538da0c31464a86c25bec86e493f91e44da4d2e6815b6221e542b not found: ID does not exist" containerID="ed793f34ea4538da0c31464a86c25bec86e493f91e44da4d2e6815b6221e542b" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.874190 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ed793f34ea4538da0c31464a86c25bec86e493f91e44da4d2e6815b6221e542b"} err="failed to get container status \"ed793f34ea4538da0c31464a86c25bec86e493f91e44da4d2e6815b6221e542b\": rpc error: code = NotFound desc = could not find container \"ed793f34ea4538da0c31464a86c25bec86e493f91e44da4d2e6815b6221e542b\": container with ID starting with ed793f34ea4538da0c31464a86c25bec86e493f91e44da4d2e6815b6221e542b not found: ID does not exist" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.874220 4764 scope.go:117] "RemoveContainer" containerID="300cab71666fbf2679ea5e68d0f3e8d76de51a643d44d801f0ca2059b05ed1fa" Mar 09 14:29:28 crc kubenswrapper[4764]: E0309 14:29:28.874463 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"300cab71666fbf2679ea5e68d0f3e8d76de51a643d44d801f0ca2059b05ed1fa\": container with ID starting with 300cab71666fbf2679ea5e68d0f3e8d76de51a643d44d801f0ca2059b05ed1fa not found: ID does not exist" containerID="300cab71666fbf2679ea5e68d0f3e8d76de51a643d44d801f0ca2059b05ed1fa" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.874488 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"300cab71666fbf2679ea5e68d0f3e8d76de51a643d44d801f0ca2059b05ed1fa"} err="failed to get container status \"300cab71666fbf2679ea5e68d0f3e8d76de51a643d44d801f0ca2059b05ed1fa\": rpc error: code = NotFound desc = could not find container \"300cab71666fbf2679ea5e68d0f3e8d76de51a643d44d801f0ca2059b05ed1fa\": container with ID starting with 300cab71666fbf2679ea5e68d0f3e8d76de51a643d44d801f0ca2059b05ed1fa not found: ID does not exist" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.874503 4764 scope.go:117] "RemoveContainer" containerID="1d44c6a6f1ca51015449d9711ca6e1ee57f52a1f6f76c5f60c6f3d34e343f24f" Mar 09 14:29:28 crc kubenswrapper[4764]: E0309 14:29:28.874797 4764 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1d44c6a6f1ca51015449d9711ca6e1ee57f52a1f6f76c5f60c6f3d34e343f24f\": container with ID starting with 1d44c6a6f1ca51015449d9711ca6e1ee57f52a1f6f76c5f60c6f3d34e343f24f not found: ID does not exist" containerID="1d44c6a6f1ca51015449d9711ca6e1ee57f52a1f6f76c5f60c6f3d34e343f24f" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.874822 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d44c6a6f1ca51015449d9711ca6e1ee57f52a1f6f76c5f60c6f3d34e343f24f"} err="failed to get container status \"1d44c6a6f1ca51015449d9711ca6e1ee57f52a1f6f76c5f60c6f3d34e343f24f\": rpc error: code = NotFound desc = could not find container \"1d44c6a6f1ca51015449d9711ca6e1ee57f52a1f6f76c5f60c6f3d34e343f24f\": container with ID starting with 1d44c6a6f1ca51015449d9711ca6e1ee57f52a1f6f76c5f60c6f3d34e343f24f not found: ID does not exist" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.874838 4764 scope.go:117] "RemoveContainer" containerID="cbaaee31f94d5b978452176039ab59d2520f7f3d0d136e8695a1528c41a1e96e" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.918565 4764 scope.go:117] "RemoveContainer" containerID="81f2426c617949ba2e886b6a6713e69e1e2306d5a032ede7615f1cde56b297b4" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.941299 4764 scope.go:117] "RemoveContainer" containerID="dab29cc25a32a534e61bf039add40e1f06d5ae10ca8ef07579cb0bf91b124a9d" Mar 09 14:29:29 crc kubenswrapper[4764]: I0309 14:29:29.572012 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07b9f49e-cd4b-44ac-ba0b-22764e3372b3" path="/var/lib/kubelet/pods/07b9f49e-cd4b-44ac-ba0b-22764e3372b3/volumes" Mar 09 14:29:29 crc kubenswrapper[4764]: I0309 14:29:29.573502 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa6ff5f6-9328-419b-a996-05bcf478b446" path="/var/lib/kubelet/pods/fa6ff5f6-9328-419b-a996-05bcf478b446/volumes" Mar 09 14:29:36 crc 
kubenswrapper[4764]: I0309 14:29:36.329389 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2n85d/must-gather-hmzh8"] Mar 09 14:29:36 crc kubenswrapper[4764]: E0309 14:29:36.331765 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07b9f49e-cd4b-44ac-ba0b-22764e3372b3" containerName="extract-content" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.331887 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="07b9f49e-cd4b-44ac-ba0b-22764e3372b3" containerName="extract-content" Mar 09 14:29:36 crc kubenswrapper[4764]: E0309 14:29:36.331954 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa6ff5f6-9328-419b-a996-05bcf478b446" containerName="extract-content" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.332016 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa6ff5f6-9328-419b-a996-05bcf478b446" containerName="extract-content" Mar 09 14:29:36 crc kubenswrapper[4764]: E0309 14:29:36.332082 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa6ff5f6-9328-419b-a996-05bcf478b446" containerName="registry-server" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.332141 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa6ff5f6-9328-419b-a996-05bcf478b446" containerName="registry-server" Mar 09 14:29:36 crc kubenswrapper[4764]: E0309 14:29:36.332234 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa6ff5f6-9328-419b-a996-05bcf478b446" containerName="extract-utilities" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.332313 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa6ff5f6-9328-419b-a996-05bcf478b446" containerName="extract-utilities" Mar 09 14:29:36 crc kubenswrapper[4764]: E0309 14:29:36.332387 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07b9f49e-cd4b-44ac-ba0b-22764e3372b3" containerName="extract-utilities" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 
14:29:36.332441 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="07b9f49e-cd4b-44ac-ba0b-22764e3372b3" containerName="extract-utilities" Mar 09 14:29:36 crc kubenswrapper[4764]: E0309 14:29:36.332498 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07b9f49e-cd4b-44ac-ba0b-22764e3372b3" containerName="registry-server" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.332549 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="07b9f49e-cd4b-44ac-ba0b-22764e3372b3" containerName="registry-server" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.332841 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa6ff5f6-9328-419b-a996-05bcf478b446" containerName="registry-server" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.332952 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="07b9f49e-cd4b-44ac-ba0b-22764e3372b3" containerName="registry-server" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.334380 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2n85d/must-gather-hmzh8" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.338446 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-2n85d"/"openshift-service-ca.crt" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.338842 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-2n85d"/"kube-root-ca.crt" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.348358 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-2n85d"/"default-dockercfg-fjlv5" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.360552 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2n85d/must-gather-hmzh8"] Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.445636 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f4069cd4-c4ea-4c35-a8e3-231f40655d27-must-gather-output\") pod \"must-gather-hmzh8\" (UID: \"f4069cd4-c4ea-4c35-a8e3-231f40655d27\") " pod="openshift-must-gather-2n85d/must-gather-hmzh8" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.445802 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bslj\" (UniqueName: \"kubernetes.io/projected/f4069cd4-c4ea-4c35-a8e3-231f40655d27-kube-api-access-8bslj\") pod \"must-gather-hmzh8\" (UID: \"f4069cd4-c4ea-4c35-a8e3-231f40655d27\") " pod="openshift-must-gather-2n85d/must-gather-hmzh8" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.548614 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f4069cd4-c4ea-4c35-a8e3-231f40655d27-must-gather-output\") pod \"must-gather-hmzh8\" (UID: \"f4069cd4-c4ea-4c35-a8e3-231f40655d27\") " 
pod="openshift-must-gather-2n85d/must-gather-hmzh8" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.548783 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bslj\" (UniqueName: \"kubernetes.io/projected/f4069cd4-c4ea-4c35-a8e3-231f40655d27-kube-api-access-8bslj\") pod \"must-gather-hmzh8\" (UID: \"f4069cd4-c4ea-4c35-a8e3-231f40655d27\") " pod="openshift-must-gather-2n85d/must-gather-hmzh8" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.549031 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f4069cd4-c4ea-4c35-a8e3-231f40655d27-must-gather-output\") pod \"must-gather-hmzh8\" (UID: \"f4069cd4-c4ea-4c35-a8e3-231f40655d27\") " pod="openshift-must-gather-2n85d/must-gather-hmzh8" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.598277 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bslj\" (UniqueName: \"kubernetes.io/projected/f4069cd4-c4ea-4c35-a8e3-231f40655d27-kube-api-access-8bslj\") pod \"must-gather-hmzh8\" (UID: \"f4069cd4-c4ea-4c35-a8e3-231f40655d27\") " pod="openshift-must-gather-2n85d/must-gather-hmzh8" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.655157 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2n85d/must-gather-hmzh8" Mar 09 14:29:37 crc kubenswrapper[4764]: I0309 14:29:37.256721 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2n85d/must-gather-hmzh8"] Mar 09 14:29:37 crc kubenswrapper[4764]: I0309 14:29:37.860965 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2n85d/must-gather-hmzh8" event={"ID":"f4069cd4-c4ea-4c35-a8e3-231f40655d27","Type":"ContainerStarted","Data":"5c35375c12dfdebcd5a99ee269599da2c2da6b1470aa36489d7414f993910904"} Mar 09 14:29:44 crc kubenswrapper[4764]: I0309 14:29:44.961923 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2n85d/must-gather-hmzh8" event={"ID":"f4069cd4-c4ea-4c35-a8e3-231f40655d27","Type":"ContainerStarted","Data":"7a1d05f1a79149bd8bced16f5f4fb67221a689028074b5bf7fc3e37c2411e2d9"} Mar 09 14:29:44 crc kubenswrapper[4764]: I0309 14:29:44.962614 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2n85d/must-gather-hmzh8" event={"ID":"f4069cd4-c4ea-4c35-a8e3-231f40655d27","Type":"ContainerStarted","Data":"3751c9a7f60cedf52e5109a5d05ced7856abcae8f6d88733462034a07970747a"} Mar 09 14:29:44 crc kubenswrapper[4764]: I0309 14:29:44.990013 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2n85d/must-gather-hmzh8" podStartSLOduration=2.271023773 podStartE2EDuration="8.989983499s" podCreationTimestamp="2026-03-09 14:29:36 +0000 UTC" firstStartedPulling="2026-03-09 14:29:37.263912014 +0000 UTC m=+4132.514083922" lastFinishedPulling="2026-03-09 14:29:43.98287174 +0000 UTC m=+4139.233043648" observedRunningTime="2026-03-09 14:29:44.982809017 +0000 UTC m=+4140.232980935" watchObservedRunningTime="2026-03-09 14:29:44.989983499 +0000 UTC m=+4140.240155417" Mar 09 14:29:49 crc kubenswrapper[4764]: I0309 14:29:49.155456 4764 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-2n85d/crc-debug-9qlpp"] Mar 09 14:29:49 crc kubenswrapper[4764]: I0309 14:29:49.157760 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2n85d/crc-debug-9qlpp" Mar 09 14:29:49 crc kubenswrapper[4764]: I0309 14:29:49.183115 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25af9291-c5a5-4dff-9eb7-960615c614c1-host\") pod \"crc-debug-9qlpp\" (UID: \"25af9291-c5a5-4dff-9eb7-960615c614c1\") " pod="openshift-must-gather-2n85d/crc-debug-9qlpp" Mar 09 14:29:49 crc kubenswrapper[4764]: I0309 14:29:49.183405 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp82d\" (UniqueName: \"kubernetes.io/projected/25af9291-c5a5-4dff-9eb7-960615c614c1-kube-api-access-kp82d\") pod \"crc-debug-9qlpp\" (UID: \"25af9291-c5a5-4dff-9eb7-960615c614c1\") " pod="openshift-must-gather-2n85d/crc-debug-9qlpp" Mar 09 14:29:49 crc kubenswrapper[4764]: I0309 14:29:49.285903 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25af9291-c5a5-4dff-9eb7-960615c614c1-host\") pod \"crc-debug-9qlpp\" (UID: \"25af9291-c5a5-4dff-9eb7-960615c614c1\") " pod="openshift-must-gather-2n85d/crc-debug-9qlpp" Mar 09 14:29:49 crc kubenswrapper[4764]: I0309 14:29:49.286110 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25af9291-c5a5-4dff-9eb7-960615c614c1-host\") pod \"crc-debug-9qlpp\" (UID: \"25af9291-c5a5-4dff-9eb7-960615c614c1\") " pod="openshift-must-gather-2n85d/crc-debug-9qlpp" Mar 09 14:29:49 crc kubenswrapper[4764]: I0309 14:29:49.286709 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp82d\" (UniqueName: 
\"kubernetes.io/projected/25af9291-c5a5-4dff-9eb7-960615c614c1-kube-api-access-kp82d\") pod \"crc-debug-9qlpp\" (UID: \"25af9291-c5a5-4dff-9eb7-960615c614c1\") " pod="openshift-must-gather-2n85d/crc-debug-9qlpp" Mar 09 14:29:49 crc kubenswrapper[4764]: I0309 14:29:49.312348 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp82d\" (UniqueName: \"kubernetes.io/projected/25af9291-c5a5-4dff-9eb7-960615c614c1-kube-api-access-kp82d\") pod \"crc-debug-9qlpp\" (UID: \"25af9291-c5a5-4dff-9eb7-960615c614c1\") " pod="openshift-must-gather-2n85d/crc-debug-9qlpp" Mar 09 14:29:49 crc kubenswrapper[4764]: I0309 14:29:49.492192 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2n85d/crc-debug-9qlpp" Mar 09 14:29:49 crc kubenswrapper[4764]: W0309 14:29:49.542472 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25af9291_c5a5_4dff_9eb7_960615c614c1.slice/crio-881e8e6e7900e3f6e6c5bcf4497258776402b46797e6c19f23effb9b0d8b76ef WatchSource:0}: Error finding container 881e8e6e7900e3f6e6c5bcf4497258776402b46797e6c19f23effb9b0d8b76ef: Status 404 returned error can't find the container with id 881e8e6e7900e3f6e6c5bcf4497258776402b46797e6c19f23effb9b0d8b76ef Mar 09 14:29:50 crc kubenswrapper[4764]: I0309 14:29:50.011196 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2n85d/crc-debug-9qlpp" event={"ID":"25af9291-c5a5-4dff-9eb7-960615c614c1","Type":"ContainerStarted","Data":"881e8e6e7900e3f6e6c5bcf4497258776402b46797e6c19f23effb9b0d8b76ef"} Mar 09 14:29:58 crc kubenswrapper[4764]: I0309 14:29:58.370934 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Mar 09 14:29:58 crc kubenswrapper[4764]: I0309 14:29:58.371948 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.160211 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551110-sjw8t"] Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.166545 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551110-sjw8t" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.170452 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.170517 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.172347 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.179879 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551110-sjw8t"] Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.181958 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqvzx\" (UniqueName: \"kubernetes.io/projected/44063b01-0b96-488c-98af-43cdb752467e-kube-api-access-wqvzx\") pod \"auto-csr-approver-29551110-sjw8t\" (UID: \"44063b01-0b96-488c-98af-43cdb752467e\") " pod="openshift-infra/auto-csr-approver-29551110-sjw8t" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.270688 4764 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff"] Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.272518 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.275588 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.275815 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.283837 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6ndb\" (UniqueName: \"kubernetes.io/projected/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-kube-api-access-v6ndb\") pod \"collect-profiles-29551110-jq7ff\" (UID: \"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.283940 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqvzx\" (UniqueName: \"kubernetes.io/projected/44063b01-0b96-488c-98af-43cdb752467e-kube-api-access-wqvzx\") pod \"auto-csr-approver-29551110-sjw8t\" (UID: \"44063b01-0b96-488c-98af-43cdb752467e\") " pod="openshift-infra/auto-csr-approver-29551110-sjw8t" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.283999 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-secret-volume\") pod \"collect-profiles-29551110-jq7ff\" (UID: \"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.284289 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-config-volume\") pod \"collect-profiles-29551110-jq7ff\" (UID: \"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.286246 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff"] Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.315606 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqvzx\" (UniqueName: \"kubernetes.io/projected/44063b01-0b96-488c-98af-43cdb752467e-kube-api-access-wqvzx\") pod \"auto-csr-approver-29551110-sjw8t\" (UID: \"44063b01-0b96-488c-98af-43cdb752467e\") " pod="openshift-infra/auto-csr-approver-29551110-sjw8t" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.386809 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6ndb\" (UniqueName: \"kubernetes.io/projected/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-kube-api-access-v6ndb\") pod \"collect-profiles-29551110-jq7ff\" (UID: \"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.386963 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-secret-volume\") pod \"collect-profiles-29551110-jq7ff\" (UID: \"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff" Mar 09 14:30:00 crc 
kubenswrapper[4764]: I0309 14:30:00.387054 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-config-volume\") pod \"collect-profiles-29551110-jq7ff\" (UID: \"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.388315 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-config-volume\") pod \"collect-profiles-29551110-jq7ff\" (UID: \"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.393136 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-secret-volume\") pod \"collect-profiles-29551110-jq7ff\" (UID: \"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.407252 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6ndb\" (UniqueName: \"kubernetes.io/projected/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-kube-api-access-v6ndb\") pod \"collect-profiles-29551110-jq7ff\" (UID: \"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.688255 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.689791 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551110-sjw8t" Mar 09 14:30:04 crc kubenswrapper[4764]: I0309 14:30:04.436057 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff"] Mar 09 14:30:04 crc kubenswrapper[4764]: W0309 14:30:04.451036 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44063b01_0b96_488c_98af_43cdb752467e.slice/crio-9f463b037f6748b8413203836384045220979ce2e648b64c69d1e25e30dfec60 WatchSource:0}: Error finding container 9f463b037f6748b8413203836384045220979ce2e648b64c69d1e25e30dfec60: Status 404 returned error can't find the container with id 9f463b037f6748b8413203836384045220979ce2e648b64c69d1e25e30dfec60 Mar 09 14:30:04 crc kubenswrapper[4764]: I0309 14:30:04.451372 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551110-sjw8t"] Mar 09 14:30:05 crc kubenswrapper[4764]: I0309 14:30:05.180960 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2n85d/crc-debug-9qlpp" event={"ID":"25af9291-c5a5-4dff-9eb7-960615c614c1","Type":"ContainerStarted","Data":"48d1ac59fa8fdaa23f3a6ee1c8dc95eb7e7c19f15c77263002430602ffce1989"} Mar 09 14:30:05 crc kubenswrapper[4764]: I0309 14:30:05.184842 4764 generic.go:334] "Generic (PLEG): container finished" podID="8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1" containerID="7c8b928d8a9177ef5b9323c3811d5e0b452dba966aeb1e937b8cab94ebc62fd3" exitCode=0 Mar 09 14:30:05 crc kubenswrapper[4764]: I0309 14:30:05.184941 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff" event={"ID":"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1","Type":"ContainerDied","Data":"7c8b928d8a9177ef5b9323c3811d5e0b452dba966aeb1e937b8cab94ebc62fd3"} Mar 09 14:30:05 crc kubenswrapper[4764]: I0309 14:30:05.184978 4764 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff" event={"ID":"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1","Type":"ContainerStarted","Data":"51fea13555bc7275650d667289bdcb28261d28171823fe98cc0700e0815f1df9"} Mar 09 14:30:05 crc kubenswrapper[4764]: I0309 14:30:05.192347 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551110-sjw8t" event={"ID":"44063b01-0b96-488c-98af-43cdb752467e","Type":"ContainerStarted","Data":"9f463b037f6748b8413203836384045220979ce2e648b64c69d1e25e30dfec60"} Mar 09 14:30:05 crc kubenswrapper[4764]: I0309 14:30:05.206452 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2n85d/crc-debug-9qlpp" podStartSLOduration=1.794139531 podStartE2EDuration="16.20642686s" podCreationTimestamp="2026-03-09 14:29:49 +0000 UTC" firstStartedPulling="2026-03-09 14:29:49.545538496 +0000 UTC m=+4144.795710404" lastFinishedPulling="2026-03-09 14:30:03.957825825 +0000 UTC m=+4159.207997733" observedRunningTime="2026-03-09 14:30:05.198509139 +0000 UTC m=+4160.448681047" watchObservedRunningTime="2026-03-09 14:30:05.20642686 +0000 UTC m=+4160.456598768" Mar 09 14:30:06 crc kubenswrapper[4764]: I0309 14:30:06.583781 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff" Mar 09 14:30:06 crc kubenswrapper[4764]: I0309 14:30:06.647830 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-config-volume\") pod \"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1\" (UID: \"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1\") " Mar 09 14:30:06 crc kubenswrapper[4764]: I0309 14:30:06.647971 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6ndb\" (UniqueName: \"kubernetes.io/projected/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-kube-api-access-v6ndb\") pod \"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1\" (UID: \"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1\") " Mar 09 14:30:06 crc kubenswrapper[4764]: I0309 14:30:06.648199 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-secret-volume\") pod \"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1\" (UID: \"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1\") " Mar 09 14:30:06 crc kubenswrapper[4764]: I0309 14:30:06.651769 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-config-volume" (OuterVolumeSpecName: "config-volume") pod "8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1" (UID: "8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:30:06 crc kubenswrapper[4764]: I0309 14:30:06.662961 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-kube-api-access-v6ndb" (OuterVolumeSpecName: "kube-api-access-v6ndb") pod "8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1" (UID: "8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1"). 
InnerVolumeSpecName "kube-api-access-v6ndb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:30:06 crc kubenswrapper[4764]: I0309 14:30:06.663216 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1" (UID: "8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:30:06 crc kubenswrapper[4764]: I0309 14:30:06.751672 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:06 crc kubenswrapper[4764]: I0309 14:30:06.751726 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:06 crc kubenswrapper[4764]: I0309 14:30:06.751740 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6ndb\" (UniqueName: \"kubernetes.io/projected/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-kube-api-access-v6ndb\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:07 crc kubenswrapper[4764]: I0309 14:30:07.213268 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff" event={"ID":"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1","Type":"ContainerDied","Data":"51fea13555bc7275650d667289bdcb28261d28171823fe98cc0700e0815f1df9"} Mar 09 14:30:07 crc kubenswrapper[4764]: I0309 14:30:07.214130 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51fea13555bc7275650d667289bdcb28261d28171823fe98cc0700e0815f1df9" Mar 09 14:30:07 crc kubenswrapper[4764]: I0309 14:30:07.213515 4764 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff" Mar 09 14:30:07 crc kubenswrapper[4764]: I0309 14:30:07.682902 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt"] Mar 09 14:30:07 crc kubenswrapper[4764]: I0309 14:30:07.697374 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt"] Mar 09 14:30:08 crc kubenswrapper[4764]: I0309 14:30:08.226152 4764 generic.go:334] "Generic (PLEG): container finished" podID="44063b01-0b96-488c-98af-43cdb752467e" containerID="b7f1a91e8b51914a12830dcf15f240318b51a9141210e70d3ec57af5cc977455" exitCode=0 Mar 09 14:30:08 crc kubenswrapper[4764]: I0309 14:30:08.226254 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551110-sjw8t" event={"ID":"44063b01-0b96-488c-98af-43cdb752467e","Type":"ContainerDied","Data":"b7f1a91e8b51914a12830dcf15f240318b51a9141210e70d3ec57af5cc977455"} Mar 09 14:30:09 crc kubenswrapper[4764]: I0309 14:30:09.574390 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d" path="/var/lib/kubelet/pods/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d/volumes" Mar 09 14:30:09 crc kubenswrapper[4764]: I0309 14:30:09.661908 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551110-sjw8t" Mar 09 14:30:09 crc kubenswrapper[4764]: I0309 14:30:09.725974 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqvzx\" (UniqueName: \"kubernetes.io/projected/44063b01-0b96-488c-98af-43cdb752467e-kube-api-access-wqvzx\") pod \"44063b01-0b96-488c-98af-43cdb752467e\" (UID: \"44063b01-0b96-488c-98af-43cdb752467e\") " Mar 09 14:30:09 crc kubenswrapper[4764]: I0309 14:30:09.737427 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44063b01-0b96-488c-98af-43cdb752467e-kube-api-access-wqvzx" (OuterVolumeSpecName: "kube-api-access-wqvzx") pod "44063b01-0b96-488c-98af-43cdb752467e" (UID: "44063b01-0b96-488c-98af-43cdb752467e"). InnerVolumeSpecName "kube-api-access-wqvzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:30:09 crc kubenswrapper[4764]: I0309 14:30:09.828823 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqvzx\" (UniqueName: \"kubernetes.io/projected/44063b01-0b96-488c-98af-43cdb752467e-kube-api-access-wqvzx\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:10 crc kubenswrapper[4764]: I0309 14:30:10.249476 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551110-sjw8t" event={"ID":"44063b01-0b96-488c-98af-43cdb752467e","Type":"ContainerDied","Data":"9f463b037f6748b8413203836384045220979ce2e648b64c69d1e25e30dfec60"} Mar 09 14:30:10 crc kubenswrapper[4764]: I0309 14:30:10.249916 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f463b037f6748b8413203836384045220979ce2e648b64c69d1e25e30dfec60" Mar 09 14:30:10 crc kubenswrapper[4764]: I0309 14:30:10.249570 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551110-sjw8t" Mar 09 14:30:10 crc kubenswrapper[4764]: I0309 14:30:10.743552 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551104-pc4h6"] Mar 09 14:30:10 crc kubenswrapper[4764]: I0309 14:30:10.755764 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551104-pc4h6"] Mar 09 14:30:11 crc kubenswrapper[4764]: I0309 14:30:11.571884 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4e97252-1933-4b92-ab28-f9713db14afb" path="/var/lib/kubelet/pods/b4e97252-1933-4b92-ab28-f9713db14afb/volumes" Mar 09 14:30:11 crc kubenswrapper[4764]: I0309 14:30:11.928100 4764 scope.go:117] "RemoveContainer" containerID="fbe599b37c79d3976b2347daa88e4c2b8bc846bfc99bce86dadf53c1a9ea4b91" Mar 09 14:30:12 crc kubenswrapper[4764]: I0309 14:30:12.000359 4764 scope.go:117] "RemoveContainer" containerID="797b0150f56f98282613dd9aa20b75479aae0c1306bab3b67c8a86168757b198" Mar 09 14:30:28 crc kubenswrapper[4764]: I0309 14:30:28.370263 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:30:28 crc kubenswrapper[4764]: I0309 14:30:28.371285 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:30:28 crc kubenswrapper[4764]: I0309 14:30:28.371362 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 
14:30:28 crc kubenswrapper[4764]: I0309 14:30:28.372687 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"956798d3640b6af2f25ed443a7b5b3c26e0da670aff756c574b1a30cc50879dd"} pod="openshift-machine-config-operator/machine-config-daemon-xxczl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 14:30:28 crc kubenswrapper[4764]: I0309 14:30:28.372776 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" containerID="cri-o://956798d3640b6af2f25ed443a7b5b3c26e0da670aff756c574b1a30cc50879dd" gracePeriod=600 Mar 09 14:30:29 crc kubenswrapper[4764]: I0309 14:30:29.450108 4764 generic.go:334] "Generic (PLEG): container finished" podID="6bcdd179-43c2-427c-9fac-7155c122e922" containerID="956798d3640b6af2f25ed443a7b5b3c26e0da670aff756c574b1a30cc50879dd" exitCode=0 Mar 09 14:30:29 crc kubenswrapper[4764]: I0309 14:30:29.450197 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerDied","Data":"956798d3640b6af2f25ed443a7b5b3c26e0da670aff756c574b1a30cc50879dd"} Mar 09 14:30:29 crc kubenswrapper[4764]: I0309 14:30:29.452071 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"} Mar 09 14:30:29 crc kubenswrapper[4764]: I0309 14:30:29.452119 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.433075 4764 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mdqd7"] Mar 09 14:30:44 crc kubenswrapper[4764]: E0309 14:30:44.438973 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44063b01-0b96-488c-98af-43cdb752467e" containerName="oc" Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.439283 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="44063b01-0b96-488c-98af-43cdb752467e" containerName="oc" Mar 09 14:30:44 crc kubenswrapper[4764]: E0309 14:30:44.439381 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1" containerName="collect-profiles" Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.439453 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1" containerName="collect-profiles" Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.439845 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1" containerName="collect-profiles" Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.439946 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="44063b01-0b96-488c-98af-43cdb752467e" containerName="oc" Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.441777 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.454919 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdqd7"] Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.581772 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-utilities\") pod \"redhat-marketplace-mdqd7\" (UID: \"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd\") " pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.582285 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zz5t\" (UniqueName: \"kubernetes.io/projected/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-kube-api-access-9zz5t\") pod \"redhat-marketplace-mdqd7\" (UID: \"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd\") " pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.582355 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-catalog-content\") pod \"redhat-marketplace-mdqd7\" (UID: \"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd\") " pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.685227 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-utilities\") pod \"redhat-marketplace-mdqd7\" (UID: \"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd\") " pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.685370 4764 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-9zz5t\" (UniqueName: \"kubernetes.io/projected/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-kube-api-access-9zz5t\") pod \"redhat-marketplace-mdqd7\" (UID: \"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd\") " pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.685458 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-catalog-content\") pod \"redhat-marketplace-mdqd7\" (UID: \"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd\") " pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.688193 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-catalog-content\") pod \"redhat-marketplace-mdqd7\" (UID: \"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd\") " pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.688473 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-utilities\") pod \"redhat-marketplace-mdqd7\" (UID: \"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd\") " pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.718848 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zz5t\" (UniqueName: \"kubernetes.io/projected/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-kube-api-access-9zz5t\") pod \"redhat-marketplace-mdqd7\" (UID: \"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd\") " pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.771030 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 14:30:45 crc kubenswrapper[4764]: W0309 14:30:45.306227 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbf30af3_6be9_48e2_8bf6_9fe8a0d4e1cd.slice/crio-e2faedc1de519d3df1b83f7d036261c9bb6514dba61e55a96fdcfbdf4aa98b4d WatchSource:0}: Error finding container e2faedc1de519d3df1b83f7d036261c9bb6514dba61e55a96fdcfbdf4aa98b4d: Status 404 returned error can't find the container with id e2faedc1de519d3df1b83f7d036261c9bb6514dba61e55a96fdcfbdf4aa98b4d Mar 09 14:30:45 crc kubenswrapper[4764]: I0309 14:30:45.312952 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdqd7"] Mar 09 14:30:45 crc kubenswrapper[4764]: I0309 14:30:45.673801 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdqd7" event={"ID":"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd","Type":"ContainerStarted","Data":"e2faedc1de519d3df1b83f7d036261c9bb6514dba61e55a96fdcfbdf4aa98b4d"} Mar 09 14:30:46 crc kubenswrapper[4764]: I0309 14:30:46.705388 4764 generic.go:334] "Generic (PLEG): container finished" podID="cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd" containerID="52e00525d2fe3223b29c4d510f146a1dfebeea02e1ec48c4d73131eba5cfe1e3" exitCode=0 Mar 09 14:30:46 crc kubenswrapper[4764]: I0309 14:30:46.705451 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdqd7" event={"ID":"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd","Type":"ContainerDied","Data":"52e00525d2fe3223b29c4d510f146a1dfebeea02e1ec48c4d73131eba5cfe1e3"} Mar 09 14:30:48 crc kubenswrapper[4764]: I0309 14:30:48.730361 4764 generic.go:334] "Generic (PLEG): container finished" podID="cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd" containerID="4ea2b80f206544876b4477dfc25fe84c0def83b59496eb5d6d5b4a2d5a2bad74" exitCode=0 Mar 09 14:30:48 crc kubenswrapper[4764]: I0309 
14:30:48.730899 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdqd7" event={"ID":"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd","Type":"ContainerDied","Data":"4ea2b80f206544876b4477dfc25fe84c0def83b59496eb5d6d5b4a2d5a2bad74"} Mar 09 14:30:49 crc kubenswrapper[4764]: I0309 14:30:49.745581 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdqd7" event={"ID":"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd","Type":"ContainerStarted","Data":"6f25afdbffc91821633a9c1aac68545587801eecb36671bddb6b3a225f8a4b4a"} Mar 09 14:30:54 crc kubenswrapper[4764]: I0309 14:30:54.771813 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 14:30:54 crc kubenswrapper[4764]: I0309 14:30:54.772712 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 14:30:54 crc kubenswrapper[4764]: I0309 14:30:54.824746 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 14:30:54 crc kubenswrapper[4764]: I0309 14:30:54.865523 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mdqd7" podStartSLOduration=8.396383339 podStartE2EDuration="10.865491691s" podCreationTimestamp="2026-03-09 14:30:44 +0000 UTC" firstStartedPulling="2026-03-09 14:30:46.711159631 +0000 UTC m=+4201.961331539" lastFinishedPulling="2026-03-09 14:30:49.180267983 +0000 UTC m=+4204.430439891" observedRunningTime="2026-03-09 14:30:49.790158296 +0000 UTC m=+4205.040330204" watchObservedRunningTime="2026-03-09 14:30:54.865491691 +0000 UTC m=+4210.115663609" Mar 09 14:30:54 crc kubenswrapper[4764]: I0309 14:30:54.884431 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 
14:30:55 crc kubenswrapper[4764]: I0309 14:30:55.073588 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdqd7"] Mar 09 14:30:55 crc kubenswrapper[4764]: I0309 14:30:55.823152 4764 generic.go:334] "Generic (PLEG): container finished" podID="25af9291-c5a5-4dff-9eb7-960615c614c1" containerID="48d1ac59fa8fdaa23f3a6ee1c8dc95eb7e7c19f15c77263002430602ffce1989" exitCode=0 Mar 09 14:30:55 crc kubenswrapper[4764]: I0309 14:30:55.823268 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2n85d/crc-debug-9qlpp" event={"ID":"25af9291-c5a5-4dff-9eb7-960615c614c1","Type":"ContainerDied","Data":"48d1ac59fa8fdaa23f3a6ee1c8dc95eb7e7c19f15c77263002430602ffce1989"} Mar 09 14:30:56 crc kubenswrapper[4764]: I0309 14:30:56.835724 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mdqd7" podUID="cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd" containerName="registry-server" containerID="cri-o://6f25afdbffc91821633a9c1aac68545587801eecb36671bddb6b3a225f8a4b4a" gracePeriod=2 Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.060054 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2n85d/crc-debug-9qlpp" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.127463 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2n85d/crc-debug-9qlpp"] Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.130497 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp82d\" (UniqueName: \"kubernetes.io/projected/25af9291-c5a5-4dff-9eb7-960615c614c1-kube-api-access-kp82d\") pod \"25af9291-c5a5-4dff-9eb7-960615c614c1\" (UID: \"25af9291-c5a5-4dff-9eb7-960615c614c1\") " Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.130804 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25af9291-c5a5-4dff-9eb7-960615c614c1-host\") pod \"25af9291-c5a5-4dff-9eb7-960615c614c1\" (UID: \"25af9291-c5a5-4dff-9eb7-960615c614c1\") " Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.130895 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25af9291-c5a5-4dff-9eb7-960615c614c1-host" (OuterVolumeSpecName: "host") pod "25af9291-c5a5-4dff-9eb7-960615c614c1" (UID: "25af9291-c5a5-4dff-9eb7-960615c614c1"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.132086 4764 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25af9291-c5a5-4dff-9eb7-960615c614c1-host\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.141978 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2n85d/crc-debug-9qlpp"] Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.143945 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25af9291-c5a5-4dff-9eb7-960615c614c1-kube-api-access-kp82d" (OuterVolumeSpecName: "kube-api-access-kp82d") pod "25af9291-c5a5-4dff-9eb7-960615c614c1" (UID: "25af9291-c5a5-4dff-9eb7-960615c614c1"). InnerVolumeSpecName "kube-api-access-kp82d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.234296 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp82d\" (UniqueName: \"kubernetes.io/projected/25af9291-c5a5-4dff-9eb7-960615c614c1-kube-api-access-kp82d\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.292637 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.437278 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-utilities\") pod \"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd\" (UID: \"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd\") " Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.437521 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-catalog-content\") pod \"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd\" (UID: \"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd\") " Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.438462 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-utilities" (OuterVolumeSpecName: "utilities") pod "cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd" (UID: "cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.443799 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zz5t\" (UniqueName: \"kubernetes.io/projected/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-kube-api-access-9zz5t\") pod \"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd\" (UID: \"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd\") " Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.448081 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.449194 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-kube-api-access-9zz5t" (OuterVolumeSpecName: "kube-api-access-9zz5t") pod "cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd" (UID: "cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd"). InnerVolumeSpecName "kube-api-access-9zz5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.470830 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd" (UID: "cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.550786 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.550833 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zz5t\" (UniqueName: \"kubernetes.io/projected/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-kube-api-access-9zz5t\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.576339 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25af9291-c5a5-4dff-9eb7-960615c614c1" path="/var/lib/kubelet/pods/25af9291-c5a5-4dff-9eb7-960615c614c1/volumes" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.849120 4764 scope.go:117] "RemoveContainer" containerID="48d1ac59fa8fdaa23f3a6ee1c8dc95eb7e7c19f15c77263002430602ffce1989" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.849136 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2n85d/crc-debug-9qlpp" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.852369 4764 generic.go:334] "Generic (PLEG): container finished" podID="cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd" containerID="6f25afdbffc91821633a9c1aac68545587801eecb36671bddb6b3a225f8a4b4a" exitCode=0 Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.852407 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdqd7" event={"ID":"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd","Type":"ContainerDied","Data":"6f25afdbffc91821633a9c1aac68545587801eecb36671bddb6b3a225f8a4b4a"} Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.852444 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdqd7" event={"ID":"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd","Type":"ContainerDied","Data":"e2faedc1de519d3df1b83f7d036261c9bb6514dba61e55a96fdcfbdf4aa98b4d"} Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.852540 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.908327 4764 scope.go:117] "RemoveContainer" containerID="6f25afdbffc91821633a9c1aac68545587801eecb36671bddb6b3a225f8a4b4a" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.970885 4764 scope.go:117] "RemoveContainer" containerID="4ea2b80f206544876b4477dfc25fe84c0def83b59496eb5d6d5b4a2d5a2bad74" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.975856 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdqd7"] Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.996813 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdqd7"] Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.001326 4764 scope.go:117] "RemoveContainer" containerID="52e00525d2fe3223b29c4d510f146a1dfebeea02e1ec48c4d73131eba5cfe1e3" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.052573 4764 scope.go:117] "RemoveContainer" containerID="6f25afdbffc91821633a9c1aac68545587801eecb36671bddb6b3a225f8a4b4a" Mar 09 14:30:58 crc kubenswrapper[4764]: E0309 14:30:58.053460 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f25afdbffc91821633a9c1aac68545587801eecb36671bddb6b3a225f8a4b4a\": container with ID starting with 6f25afdbffc91821633a9c1aac68545587801eecb36671bddb6b3a225f8a4b4a not found: ID does not exist" containerID="6f25afdbffc91821633a9c1aac68545587801eecb36671bddb6b3a225f8a4b4a" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.053527 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f25afdbffc91821633a9c1aac68545587801eecb36671bddb6b3a225f8a4b4a"} err="failed to get container status \"6f25afdbffc91821633a9c1aac68545587801eecb36671bddb6b3a225f8a4b4a\": rpc error: code = NotFound desc = could not find container 
\"6f25afdbffc91821633a9c1aac68545587801eecb36671bddb6b3a225f8a4b4a\": container with ID starting with 6f25afdbffc91821633a9c1aac68545587801eecb36671bddb6b3a225f8a4b4a not found: ID does not exist" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.053570 4764 scope.go:117] "RemoveContainer" containerID="4ea2b80f206544876b4477dfc25fe84c0def83b59496eb5d6d5b4a2d5a2bad74" Mar 09 14:30:58 crc kubenswrapper[4764]: E0309 14:30:58.054018 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ea2b80f206544876b4477dfc25fe84c0def83b59496eb5d6d5b4a2d5a2bad74\": container with ID starting with 4ea2b80f206544876b4477dfc25fe84c0def83b59496eb5d6d5b4a2d5a2bad74 not found: ID does not exist" containerID="4ea2b80f206544876b4477dfc25fe84c0def83b59496eb5d6d5b4a2d5a2bad74" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.054079 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ea2b80f206544876b4477dfc25fe84c0def83b59496eb5d6d5b4a2d5a2bad74"} err="failed to get container status \"4ea2b80f206544876b4477dfc25fe84c0def83b59496eb5d6d5b4a2d5a2bad74\": rpc error: code = NotFound desc = could not find container \"4ea2b80f206544876b4477dfc25fe84c0def83b59496eb5d6d5b4a2d5a2bad74\": container with ID starting with 4ea2b80f206544876b4477dfc25fe84c0def83b59496eb5d6d5b4a2d5a2bad74 not found: ID does not exist" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.054132 4764 scope.go:117] "RemoveContainer" containerID="52e00525d2fe3223b29c4d510f146a1dfebeea02e1ec48c4d73131eba5cfe1e3" Mar 09 14:30:58 crc kubenswrapper[4764]: E0309 14:30:58.055196 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52e00525d2fe3223b29c4d510f146a1dfebeea02e1ec48c4d73131eba5cfe1e3\": container with ID starting with 52e00525d2fe3223b29c4d510f146a1dfebeea02e1ec48c4d73131eba5cfe1e3 not found: ID does not exist" 
containerID="52e00525d2fe3223b29c4d510f146a1dfebeea02e1ec48c4d73131eba5cfe1e3" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.055258 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52e00525d2fe3223b29c4d510f146a1dfebeea02e1ec48c4d73131eba5cfe1e3"} err="failed to get container status \"52e00525d2fe3223b29c4d510f146a1dfebeea02e1ec48c4d73131eba5cfe1e3\": rpc error: code = NotFound desc = could not find container \"52e00525d2fe3223b29c4d510f146a1dfebeea02e1ec48c4d73131eba5cfe1e3\": container with ID starting with 52e00525d2fe3223b29c4d510f146a1dfebeea02e1ec48c4d73131eba5cfe1e3 not found: ID does not exist" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.314202 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2n85d/crc-debug-qhjgj"] Mar 09 14:30:58 crc kubenswrapper[4764]: E0309 14:30:58.315001 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd" containerName="registry-server" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.315027 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd" containerName="registry-server" Mar 09 14:30:58 crc kubenswrapper[4764]: E0309 14:30:58.315044 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd" containerName="extract-utilities" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.315053 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd" containerName="extract-utilities" Mar 09 14:30:58 crc kubenswrapper[4764]: E0309 14:30:58.315075 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25af9291-c5a5-4dff-9eb7-960615c614c1" containerName="container-00" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.315085 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="25af9291-c5a5-4dff-9eb7-960615c614c1" 
containerName="container-00" Mar 09 14:30:58 crc kubenswrapper[4764]: E0309 14:30:58.315104 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd" containerName="extract-content" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.315112 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd" containerName="extract-content" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.315384 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd" containerName="registry-server" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.315412 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="25af9291-c5a5-4dff-9eb7-960615c614c1" containerName="container-00" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.316579 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2n85d/crc-debug-qhjgj" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.478665 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t58w\" (UniqueName: \"kubernetes.io/projected/e08fcbdc-9242-4581-82ee-a3692f6d0d03-kube-api-access-8t58w\") pod \"crc-debug-qhjgj\" (UID: \"e08fcbdc-9242-4581-82ee-a3692f6d0d03\") " pod="openshift-must-gather-2n85d/crc-debug-qhjgj" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.479121 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e08fcbdc-9242-4581-82ee-a3692f6d0d03-host\") pod \"crc-debug-qhjgj\" (UID: \"e08fcbdc-9242-4581-82ee-a3692f6d0d03\") " pod="openshift-must-gather-2n85d/crc-debug-qhjgj" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.581558 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t58w\" (UniqueName: 
\"kubernetes.io/projected/e08fcbdc-9242-4581-82ee-a3692f6d0d03-kube-api-access-8t58w\") pod \"crc-debug-qhjgj\" (UID: \"e08fcbdc-9242-4581-82ee-a3692f6d0d03\") " pod="openshift-must-gather-2n85d/crc-debug-qhjgj" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.581788 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e08fcbdc-9242-4581-82ee-a3692f6d0d03-host\") pod \"crc-debug-qhjgj\" (UID: \"e08fcbdc-9242-4581-82ee-a3692f6d0d03\") " pod="openshift-must-gather-2n85d/crc-debug-qhjgj" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.582017 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e08fcbdc-9242-4581-82ee-a3692f6d0d03-host\") pod \"crc-debug-qhjgj\" (UID: \"e08fcbdc-9242-4581-82ee-a3692f6d0d03\") " pod="openshift-must-gather-2n85d/crc-debug-qhjgj" Mar 09 14:30:59 crc kubenswrapper[4764]: I0309 14:30:58.998165 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t58w\" (UniqueName: \"kubernetes.io/projected/e08fcbdc-9242-4581-82ee-a3692f6d0d03-kube-api-access-8t58w\") pod \"crc-debug-qhjgj\" (UID: \"e08fcbdc-9242-4581-82ee-a3692f6d0d03\") " pod="openshift-must-gather-2n85d/crc-debug-qhjgj" Mar 09 14:30:59 crc kubenswrapper[4764]: I0309 14:30:59.237958 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2n85d/crc-debug-qhjgj" Mar 09 14:30:59 crc kubenswrapper[4764]: I0309 14:30:59.572669 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd" path="/var/lib/kubelet/pods/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd/volumes" Mar 09 14:30:59 crc kubenswrapper[4764]: I0309 14:30:59.880240 4764 generic.go:334] "Generic (PLEG): container finished" podID="e08fcbdc-9242-4581-82ee-a3692f6d0d03" containerID="35306a339ac014f00b9490fd62d260c20edfb15b36c977f1b4e50e10596530a4" exitCode=0 Mar 09 14:30:59 crc kubenswrapper[4764]: I0309 14:30:59.880310 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2n85d/crc-debug-qhjgj" event={"ID":"e08fcbdc-9242-4581-82ee-a3692f6d0d03","Type":"ContainerDied","Data":"35306a339ac014f00b9490fd62d260c20edfb15b36c977f1b4e50e10596530a4"} Mar 09 14:30:59 crc kubenswrapper[4764]: I0309 14:30:59.880345 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2n85d/crc-debug-qhjgj" event={"ID":"e08fcbdc-9242-4581-82ee-a3692f6d0d03","Type":"ContainerStarted","Data":"13cbfd7cca97d0bf58dc315609fa12973b5144274df00b90524a64d7a54d7be7"} Mar 09 14:31:01 crc kubenswrapper[4764]: I0309 14:31:01.492239 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2n85d/crc-debug-qhjgj" Mar 09 14:31:01 crc kubenswrapper[4764]: I0309 14:31:01.558526 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e08fcbdc-9242-4581-82ee-a3692f6d0d03-host\") pod \"e08fcbdc-9242-4581-82ee-a3692f6d0d03\" (UID: \"e08fcbdc-9242-4581-82ee-a3692f6d0d03\") " Mar 09 14:31:01 crc kubenswrapper[4764]: I0309 14:31:01.558687 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e08fcbdc-9242-4581-82ee-a3692f6d0d03-host" (OuterVolumeSpecName: "host") pod "e08fcbdc-9242-4581-82ee-a3692f6d0d03" (UID: "e08fcbdc-9242-4581-82ee-a3692f6d0d03"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:31:01 crc kubenswrapper[4764]: I0309 14:31:01.559172 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8t58w\" (UniqueName: \"kubernetes.io/projected/e08fcbdc-9242-4581-82ee-a3692f6d0d03-kube-api-access-8t58w\") pod \"e08fcbdc-9242-4581-82ee-a3692f6d0d03\" (UID: \"e08fcbdc-9242-4581-82ee-a3692f6d0d03\") " Mar 09 14:31:01 crc kubenswrapper[4764]: I0309 14:31:01.559840 4764 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e08fcbdc-9242-4581-82ee-a3692f6d0d03-host\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:01 crc kubenswrapper[4764]: I0309 14:31:01.567165 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e08fcbdc-9242-4581-82ee-a3692f6d0d03-kube-api-access-8t58w" (OuterVolumeSpecName: "kube-api-access-8t58w") pod "e08fcbdc-9242-4581-82ee-a3692f6d0d03" (UID: "e08fcbdc-9242-4581-82ee-a3692f6d0d03"). InnerVolumeSpecName "kube-api-access-8t58w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:31:01 crc kubenswrapper[4764]: I0309 14:31:01.662032 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8t58w\" (UniqueName: \"kubernetes.io/projected/e08fcbdc-9242-4581-82ee-a3692f6d0d03-kube-api-access-8t58w\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:01 crc kubenswrapper[4764]: I0309 14:31:01.913598 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2n85d/crc-debug-qhjgj" event={"ID":"e08fcbdc-9242-4581-82ee-a3692f6d0d03","Type":"ContainerDied","Data":"13cbfd7cca97d0bf58dc315609fa12973b5144274df00b90524a64d7a54d7be7"} Mar 09 14:31:01 crc kubenswrapper[4764]: I0309 14:31:01.913993 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13cbfd7cca97d0bf58dc315609fa12973b5144274df00b90524a64d7a54d7be7" Mar 09 14:31:01 crc kubenswrapper[4764]: I0309 14:31:01.915140 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2n85d/crc-debug-qhjgj" Mar 09 14:31:01 crc kubenswrapper[4764]: I0309 14:31:01.968280 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2n85d/crc-debug-qhjgj"] Mar 09 14:31:01 crc kubenswrapper[4764]: I0309 14:31:01.977745 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2n85d/crc-debug-qhjgj"] Mar 09 14:31:03 crc kubenswrapper[4764]: I0309 14:31:03.168906 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2n85d/crc-debug-x4r2v"] Mar 09 14:31:03 crc kubenswrapper[4764]: E0309 14:31:03.169811 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e08fcbdc-9242-4581-82ee-a3692f6d0d03" containerName="container-00" Mar 09 14:31:03 crc kubenswrapper[4764]: I0309 14:31:03.169827 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e08fcbdc-9242-4581-82ee-a3692f6d0d03" containerName="container-00" Mar 09 14:31:03 crc 
kubenswrapper[4764]: I0309 14:31:03.170089 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e08fcbdc-9242-4581-82ee-a3692f6d0d03" containerName="container-00" Mar 09 14:31:03 crc kubenswrapper[4764]: I0309 14:31:03.170974 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2n85d/crc-debug-x4r2v" Mar 09 14:31:03 crc kubenswrapper[4764]: I0309 14:31:03.299673 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-275qw\" (UniqueName: \"kubernetes.io/projected/30f041da-e1c7-4f92-b9b0-7c7b0495ecb6-kube-api-access-275qw\") pod \"crc-debug-x4r2v\" (UID: \"30f041da-e1c7-4f92-b9b0-7c7b0495ecb6\") " pod="openshift-must-gather-2n85d/crc-debug-x4r2v" Mar 09 14:31:03 crc kubenswrapper[4764]: I0309 14:31:03.299926 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30f041da-e1c7-4f92-b9b0-7c7b0495ecb6-host\") pod \"crc-debug-x4r2v\" (UID: \"30f041da-e1c7-4f92-b9b0-7c7b0495ecb6\") " pod="openshift-must-gather-2n85d/crc-debug-x4r2v" Mar 09 14:31:03 crc kubenswrapper[4764]: I0309 14:31:03.402243 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30f041da-e1c7-4f92-b9b0-7c7b0495ecb6-host\") pod \"crc-debug-x4r2v\" (UID: \"30f041da-e1c7-4f92-b9b0-7c7b0495ecb6\") " pod="openshift-must-gather-2n85d/crc-debug-x4r2v" Mar 09 14:31:03 crc kubenswrapper[4764]: I0309 14:31:03.402448 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-275qw\" (UniqueName: \"kubernetes.io/projected/30f041da-e1c7-4f92-b9b0-7c7b0495ecb6-kube-api-access-275qw\") pod \"crc-debug-x4r2v\" (UID: \"30f041da-e1c7-4f92-b9b0-7c7b0495ecb6\") " pod="openshift-must-gather-2n85d/crc-debug-x4r2v" Mar 09 14:31:03 crc kubenswrapper[4764]: I0309 14:31:03.402488 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30f041da-e1c7-4f92-b9b0-7c7b0495ecb6-host\") pod \"crc-debug-x4r2v\" (UID: \"30f041da-e1c7-4f92-b9b0-7c7b0495ecb6\") " pod="openshift-must-gather-2n85d/crc-debug-x4r2v" Mar 09 14:31:03 crc kubenswrapper[4764]: I0309 14:31:03.424365 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-275qw\" (UniqueName: \"kubernetes.io/projected/30f041da-e1c7-4f92-b9b0-7c7b0495ecb6-kube-api-access-275qw\") pod \"crc-debug-x4r2v\" (UID: \"30f041da-e1c7-4f92-b9b0-7c7b0495ecb6\") " pod="openshift-must-gather-2n85d/crc-debug-x4r2v" Mar 09 14:31:03 crc kubenswrapper[4764]: I0309 14:31:03.491283 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2n85d/crc-debug-x4r2v" Mar 09 14:31:03 crc kubenswrapper[4764]: W0309 14:31:03.546512 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30f041da_e1c7_4f92_b9b0_7c7b0495ecb6.slice/crio-15f0cbbd2a71e81bfebe228c5291dae19bace1e24e6d6ea39032fc0c8d21a940 WatchSource:0}: Error finding container 15f0cbbd2a71e81bfebe228c5291dae19bace1e24e6d6ea39032fc0c8d21a940: Status 404 returned error can't find the container with id 15f0cbbd2a71e81bfebe228c5291dae19bace1e24e6d6ea39032fc0c8d21a940 Mar 09 14:31:03 crc kubenswrapper[4764]: I0309 14:31:03.571973 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e08fcbdc-9242-4581-82ee-a3692f6d0d03" path="/var/lib/kubelet/pods/e08fcbdc-9242-4581-82ee-a3692f6d0d03/volumes" Mar 09 14:31:03 crc kubenswrapper[4764]: I0309 14:31:03.938072 4764 generic.go:334] "Generic (PLEG): container finished" podID="30f041da-e1c7-4f92-b9b0-7c7b0495ecb6" containerID="3e7d6c7da6785672953242f3e84bfab6bab032496236e65977dca3d2ae3192cd" exitCode=0 Mar 09 14:31:03 crc kubenswrapper[4764]: I0309 14:31:03.938139 4764 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2n85d/crc-debug-x4r2v" event={"ID":"30f041da-e1c7-4f92-b9b0-7c7b0495ecb6","Type":"ContainerDied","Data":"3e7d6c7da6785672953242f3e84bfab6bab032496236e65977dca3d2ae3192cd"} Mar 09 14:31:03 crc kubenswrapper[4764]: I0309 14:31:03.938540 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2n85d/crc-debug-x4r2v" event={"ID":"30f041da-e1c7-4f92-b9b0-7c7b0495ecb6","Type":"ContainerStarted","Data":"15f0cbbd2a71e81bfebe228c5291dae19bace1e24e6d6ea39032fc0c8d21a940"} Mar 09 14:31:03 crc kubenswrapper[4764]: I0309 14:31:03.986974 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2n85d/crc-debug-x4r2v"] Mar 09 14:31:03 crc kubenswrapper[4764]: I0309 14:31:03.999838 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2n85d/crc-debug-x4r2v"] Mar 09 14:31:05 crc kubenswrapper[4764]: I0309 14:31:05.053497 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2n85d/crc-debug-x4r2v" Mar 09 14:31:05 crc kubenswrapper[4764]: I0309 14:31:05.147697 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30f041da-e1c7-4f92-b9b0-7c7b0495ecb6-host\") pod \"30f041da-e1c7-4f92-b9b0-7c7b0495ecb6\" (UID: \"30f041da-e1c7-4f92-b9b0-7c7b0495ecb6\") " Mar 09 14:31:05 crc kubenswrapper[4764]: I0309 14:31:05.147842 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-275qw\" (UniqueName: \"kubernetes.io/projected/30f041da-e1c7-4f92-b9b0-7c7b0495ecb6-kube-api-access-275qw\") pod \"30f041da-e1c7-4f92-b9b0-7c7b0495ecb6\" (UID: \"30f041da-e1c7-4f92-b9b0-7c7b0495ecb6\") " Mar 09 14:31:05 crc kubenswrapper[4764]: I0309 14:31:05.148082 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/30f041da-e1c7-4f92-b9b0-7c7b0495ecb6-host" (OuterVolumeSpecName: "host") pod "30f041da-e1c7-4f92-b9b0-7c7b0495ecb6" (UID: "30f041da-e1c7-4f92-b9b0-7c7b0495ecb6"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:31:05 crc kubenswrapper[4764]: I0309 14:31:05.148499 4764 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30f041da-e1c7-4f92-b9b0-7c7b0495ecb6-host\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:05 crc kubenswrapper[4764]: I0309 14:31:05.157111 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30f041da-e1c7-4f92-b9b0-7c7b0495ecb6-kube-api-access-275qw" (OuterVolumeSpecName: "kube-api-access-275qw") pod "30f041da-e1c7-4f92-b9b0-7c7b0495ecb6" (UID: "30f041da-e1c7-4f92-b9b0-7c7b0495ecb6"). InnerVolumeSpecName "kube-api-access-275qw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:31:05 crc kubenswrapper[4764]: I0309 14:31:05.250679 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-275qw\" (UniqueName: \"kubernetes.io/projected/30f041da-e1c7-4f92-b9b0-7c7b0495ecb6-kube-api-access-275qw\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:05 crc kubenswrapper[4764]: I0309 14:31:05.574755 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30f041da-e1c7-4f92-b9b0-7c7b0495ecb6" path="/var/lib/kubelet/pods/30f041da-e1c7-4f92-b9b0-7c7b0495ecb6/volumes" Mar 09 14:31:05 crc kubenswrapper[4764]: I0309 14:31:05.960174 4764 scope.go:117] "RemoveContainer" containerID="3e7d6c7da6785672953242f3e84bfab6bab032496236e65977dca3d2ae3192cd" Mar 09 14:31:05 crc kubenswrapper[4764]: I0309 14:31:05.960510 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2n85d/crc-debug-x4r2v" Mar 09 14:31:36 crc kubenswrapper[4764]: I0309 14:31:36.010539 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-94887676d-fp9dl_b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19/barbican-api/0.log" Mar 09 14:31:36 crc kubenswrapper[4764]: I0309 14:31:36.901408 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7c6f54974-hws5g_154490f8-97ab-4703-a96c-16b6d5f7a178/barbican-keystone-listener/0.log" Mar 09 14:31:36 crc kubenswrapper[4764]: I0309 14:31:36.925118 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-94887676d-fp9dl_b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19/barbican-api-log/0.log" Mar 09 14:31:37 crc kubenswrapper[4764]: I0309 14:31:37.198676 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-789c56cf69-2dj2c_a18071d3-1164-4080-9095-919bb5349bb8/barbican-worker/0.log" Mar 09 14:31:37 crc kubenswrapper[4764]: I0309 14:31:37.260138 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-789c56cf69-2dj2c_a18071d3-1164-4080-9095-919bb5349bb8/barbican-worker-log/0.log" Mar 09 14:31:37 crc kubenswrapper[4764]: I0309 14:31:37.280161 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7c6f54974-hws5g_154490f8-97ab-4703-a96c-16b6d5f7a178/barbican-keystone-listener-log/0.log" Mar 09 14:31:37 crc kubenswrapper[4764]: I0309 14:31:37.514561 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk_7bd401e1-1592-4b49-8eb2-b6dcba296b36/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 14:31:37 crc kubenswrapper[4764]: I0309 14:31:37.526679 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_182886b1-5569-456a-aa1e-129021e95bfe/ceilometer-central-agent/0.log" Mar 09 14:31:37 crc kubenswrapper[4764]: I0309 14:31:37.746861 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_182886b1-5569-456a-aa1e-129021e95bfe/ceilometer-notification-agent/0.log" Mar 09 14:31:37 crc kubenswrapper[4764]: I0309 14:31:37.804245 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_182886b1-5569-456a-aa1e-129021e95bfe/sg-core/0.log" Mar 09 14:31:37 crc kubenswrapper[4764]: I0309 14:31:37.807082 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_182886b1-5569-456a-aa1e-129021e95bfe/proxy-httpd/0.log" Mar 09 14:31:37 crc kubenswrapper[4764]: I0309 14:31:37.978780 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp_e8ac27d6-e52e-4d38-b772-6ada493e746f/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 14:31:38 crc kubenswrapper[4764]: I0309 14:31:38.106096 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv_cbffc6a1-81df-479c-b40e-3f865c187a73/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 14:31:38 crc kubenswrapper[4764]: I0309 14:31:38.282142 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9cf43ab7-e625-4ffa-9af4-9f810a43d270/cinder-api/0.log" Mar 09 14:31:38 crc kubenswrapper[4764]: I0309 14:31:38.297784 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9cf43ab7-e625-4ffa-9af4-9f810a43d270/cinder-api-log/0.log" Mar 09 14:31:38 crc kubenswrapper[4764]: I0309 14:31:38.617259 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_e6a8f674-82eb-4474-973d-54a90e5fd1e0/cinder-backup/0.log" Mar 09 14:31:38 crc kubenswrapper[4764]: I0309 14:31:38.623288 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_e6a8f674-82eb-4474-973d-54a90e5fd1e0/probe/0.log" Mar 09 14:31:38 crc kubenswrapper[4764]: I0309 14:31:38.695518 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_05d58314-31c8-4b6a-8c8c-1dc211d9f424/cinder-scheduler/0.log" Mar 09 14:31:39 crc kubenswrapper[4764]: I0309 14:31:39.429962 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_05d58314-31c8-4b6a-8c8c-1dc211d9f424/probe/0.log" Mar 09 14:31:39 crc kubenswrapper[4764]: I0309 14:31:39.489802 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_9aaa370f-a3d5-4fce-9761-873aeb8d7b1f/cinder-volume/0.log" Mar 09 14:31:39 crc kubenswrapper[4764]: I0309 14:31:39.531823 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_9aaa370f-a3d5-4fce-9761-873aeb8d7b1f/probe/0.log" Mar 09 14:31:39 crc kubenswrapper[4764]: I0309 14:31:39.975626 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-trkg2_5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 14:31:40 crc kubenswrapper[4764]: I0309 14:31:40.162008 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl_ea3a2b04-e009-4dcd-8eca-543cc084b329/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 14:31:40 crc kubenswrapper[4764]: I0309 14:31:40.342597 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-69655fd4bf-qlzqn_1552c7db-c992-4b43-8f1e-2b752d718f36/init/0.log" Mar 09 14:31:40 crc kubenswrapper[4764]: I0309 14:31:40.576202 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-69655fd4bf-qlzqn_1552c7db-c992-4b43-8f1e-2b752d718f36/init/0.log" Mar 09 14:31:40 crc kubenswrapper[4764]: I0309 14:31:40.604002 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-69655fd4bf-qlzqn_1552c7db-c992-4b43-8f1e-2b752d718f36/dnsmasq-dns/0.log" Mar 09 14:31:40 crc kubenswrapper[4764]: I0309 14:31:40.636322 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_22563404-fb5a-4d95-bae1-dd24d6fcc8d1/glance-httpd/0.log" Mar 09 14:31:40 crc kubenswrapper[4764]: I0309 14:31:40.761215 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_22563404-fb5a-4d95-bae1-dd24d6fcc8d1/glance-log/0.log" Mar 09 14:31:40 crc kubenswrapper[4764]: I0309 14:31:40.858280 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_66d58a1b-5d94-4d28-bcb3-0b20f0516eab/glance-httpd/0.log" Mar 09 14:31:40 crc kubenswrapper[4764]: I0309 14:31:40.862502 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_66d58a1b-5d94-4d28-bcb3-0b20f0516eab/glance-log/0.log" Mar 09 14:31:41 crc kubenswrapper[4764]: I0309 14:31:41.151500 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s_949d7512-b3be-4068-b05a-20589fbc2b52/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 14:31:41 crc kubenswrapper[4764]: I0309 14:31:41.156879 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-56bb55c768-vchmw_47ef29f6-4627-4b84-968d-db9d7ed438da/horizon/0.log" Mar 09 14:31:41 crc kubenswrapper[4764]: I0309 14:31:41.264958 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-56bb55c768-vchmw_47ef29f6-4627-4b84-968d-db9d7ed438da/horizon-log/0.log" Mar 09 14:31:41 crc kubenswrapper[4764]: I0309 14:31:41.434814 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-bhp5p_2d2ddcdd-77bf-4dc5-8170-02d297378dcb/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 14:31:41 crc kubenswrapper[4764]: I0309 14:31:41.609231 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29551081-wz9hv_6ec256c5-cf20-4b12-bb84-0f5d3e02460a/keystone-cron/0.log" Mar 09 14:31:41 crc kubenswrapper[4764]: I0309 14:31:41.688948 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_179736ec-4215-4ad8-9800-a186978a767f/kube-state-metrics/0.log" Mar 09 14:31:41 crc kubenswrapper[4764]: I0309 14:31:41.898147 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-9pl98_0a9ed7f5-c296-41ac-ae0d-5845c66a385a/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 14:31:42 crc kubenswrapper[4764]: I0309 14:31:42.439006 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-759c9c64fb-nwls6_48b871c4-f2e8-44e9-9268-54920414c084/keystone-api/0.log" Mar 09 14:31:42 crc kubenswrapper[4764]: I0309 14:31:42.527675 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_492d78a8-09ea-4239-a53f-b8d0480fcf36/probe/0.log" Mar 09 14:31:42 crc kubenswrapper[4764]: I0309 14:31:42.641073 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d/manila-api/0.log" Mar 09 14:31:42 crc kubenswrapper[4764]: I0309 14:31:42.667340 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_492d78a8-09ea-4239-a53f-b8d0480fcf36/manila-scheduler/0.log" Mar 09 14:31:42 crc kubenswrapper[4764]: I0309 14:31:42.935448 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_2caafd00-b539-4f40-b1c6-af6957bcb458/probe/0.log" Mar 09 14:31:43 crc kubenswrapper[4764]: I0309 14:31:43.112288 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_2caafd00-b539-4f40-b1c6-af6957bcb458/manila-share/0.log" Mar 09 14:31:43 crc kubenswrapper[4764]: I0309 14:31:43.345268 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d/manila-api-log/0.log" Mar 09 14:31:43 crc kubenswrapper[4764]: I0309 14:31:43.367088 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6b7bfdfd5-56dnz_fd7dadfc-b8e4-479f-8880-4ffeec051d30/neutron-api/0.log" Mar 09 14:31:43 crc kubenswrapper[4764]: I0309 14:31:43.440187 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6b7bfdfd5-56dnz_fd7dadfc-b8e4-479f-8880-4ffeec051d30/neutron-httpd/0.log" Mar 09 14:31:43 crc kubenswrapper[4764]: I0309 14:31:43.592276 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj_8a38f1e2-ce88-47d9-883d-4d95c781d181/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 14:31:43 crc kubenswrapper[4764]: I0309 14:31:43.982950 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ec790643-05dd-4f21-82f8-ad1586087d85/nova-api-log/0.log" Mar 09 14:31:44 crc kubenswrapper[4764]: I0309 14:31:44.123673 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9/nova-cell0-conductor-conductor/0.log" Mar 09 14:31:44 crc kubenswrapper[4764]: I0309 14:31:44.255369 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ec790643-05dd-4f21-82f8-ad1586087d85/nova-api-api/0.log" Mar 09 14:31:44 crc kubenswrapper[4764]: I0309 14:31:44.327925 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_959eb23f-c4b4-4f35-b284-38212848a084/nova-cell1-conductor-conductor/0.log" Mar 09 14:31:44 crc kubenswrapper[4764]: I0309 14:31:44.532434 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_6932dd15-578a-4965-bcb9-b506d4e3cd2f/nova-cell1-novncproxy-novncproxy/0.log" Mar 09 14:31:44 crc kubenswrapper[4764]: I0309 14:31:44.672714 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq_eab144b6-e27c-4ffc-9dd5-6236ca12719f/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 14:31:44 crc kubenswrapper[4764]: I0309 14:31:44.883524 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_d9226790-b0dc-460b-8c06-127effde8c19/nova-metadata-log/0.log" Mar 09 14:31:45 crc kubenswrapper[4764]: I0309 14:31:45.140877 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-scheduler-0_7d26ba33-e370-4bc8-bb15-b727c0c9c97f/nova-scheduler-scheduler/0.log" Mar 09 14:31:45 crc kubenswrapper[4764]: I0309 14:31:45.263397 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_103cd40b-aa84-4973-8e47-8a67e5994c80/mysql-bootstrap/0.log" Mar 09 14:31:45 crc kubenswrapper[4764]: I0309 14:31:45.467191 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_103cd40b-aa84-4973-8e47-8a67e5994c80/galera/0.log" Mar 09 14:31:45 crc kubenswrapper[4764]: I0309 14:31:45.473418 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_103cd40b-aa84-4973-8e47-8a67e5994c80/mysql-bootstrap/0.log" Mar 09 14:31:45 crc kubenswrapper[4764]: I0309 14:31:45.746893 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0c87ed75-4285-4084-bce3-ee8dba7671c0/mysql-bootstrap/0.log" Mar 09 14:31:45 crc kubenswrapper[4764]: I0309 14:31:45.919388 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0c87ed75-4285-4084-bce3-ee8dba7671c0/mysql-bootstrap/0.log" Mar 09 14:31:45 crc kubenswrapper[4764]: I0309 14:31:45.962100 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0c87ed75-4285-4084-bce3-ee8dba7671c0/galera/0.log" Mar 09 14:31:46 crc kubenswrapper[4764]: I0309 14:31:46.107806 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_d82ed357-9f4c-478b-b893-ab6ff10fc83c/openstackclient/0.log" Mar 09 14:31:46 crc kubenswrapper[4764]: I0309 14:31:46.295505 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-8ctgr_4db14a6b-d372-48be-86a1-bf651618b4a4/openstack-network-exporter/0.log" Mar 09 14:31:46 crc kubenswrapper[4764]: I0309 14:31:46.474369 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-2zkzm_05f9485e-b683-481d-87d3-fb86ebb4a832/ovsdb-server-init/0.log" Mar 09 14:31:46 crc kubenswrapper[4764]: I0309 14:31:46.608802 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_d9226790-b0dc-460b-8c06-127effde8c19/nova-metadata-metadata/0.log" Mar 09 14:31:46 crc kubenswrapper[4764]: I0309 14:31:46.675168 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2zkzm_05f9485e-b683-481d-87d3-fb86ebb4a832/ovs-vswitchd/0.log" Mar 09 14:31:46 crc kubenswrapper[4764]: I0309 14:31:46.689710 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2zkzm_05f9485e-b683-481d-87d3-fb86ebb4a832/ovsdb-server-init/0.log" Mar 09 14:31:46 crc kubenswrapper[4764]: I0309 14:31:46.729293 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2zkzm_05f9485e-b683-481d-87d3-fb86ebb4a832/ovsdb-server/0.log" Mar 09 14:31:46 crc kubenswrapper[4764]: I0309 14:31:46.902743 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-qm7vs_9bbe03cf-76d5-440a-903f-50c382aa3a4e/ovn-controller/0.log" Mar 09 14:31:47 crc kubenswrapper[4764]: I0309 14:31:47.098800 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-bkvx8_ede2526d-593a-4258-9ec2-172270be638a/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 14:31:47 crc kubenswrapper[4764]: I0309 14:31:47.152088 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_142a1ef0-f024-4a81-85de-72435cd72d9e/openstack-network-exporter/0.log" Mar 09 14:31:47 crc kubenswrapper[4764]: I0309 14:31:47.336772 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_142a1ef0-f024-4a81-85de-72435cd72d9e/ovn-northd/0.log" Mar 09 14:31:47 crc kubenswrapper[4764]: I0309 14:31:47.394281 4764 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e54bd06b-1ee2-452d-80fb-12fd4fb61c7b/openstack-network-exporter/0.log" Mar 09 14:31:47 crc kubenswrapper[4764]: I0309 14:31:47.551886 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e54bd06b-1ee2-452d-80fb-12fd4fb61c7b/ovsdbserver-nb/0.log" Mar 09 14:31:47 crc kubenswrapper[4764]: I0309 14:31:47.651631 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_047aa387-9e35-4ec6-89a9-3be60e47610b/openstack-network-exporter/0.log" Mar 09 14:31:47 crc kubenswrapper[4764]: I0309 14:31:47.689329 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_047aa387-9e35-4ec6-89a9-3be60e47610b/ovsdbserver-sb/0.log" Mar 09 14:31:48 crc kubenswrapper[4764]: I0309 14:31:48.013508 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-f85c59cb-gm4df_2a26a533-a42a-4553-96b3-922ad860ca7a/placement-log/0.log" Mar 09 14:31:48 crc kubenswrapper[4764]: I0309 14:31:48.064335 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-f85c59cb-gm4df_2a26a533-a42a-4553-96b3-922ad860ca7a/placement-api/0.log" Mar 09 14:31:48 crc kubenswrapper[4764]: I0309 14:31:48.191227 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c5579dd7-5380-4042-8c78-c6837d841d5e/setup-container/0.log" Mar 09 14:31:48 crc kubenswrapper[4764]: I0309 14:31:48.399850 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c5579dd7-5380-4042-8c78-c6837d841d5e/setup-container/0.log" Mar 09 14:31:48 crc kubenswrapper[4764]: I0309 14:31:48.422992 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c5579dd7-5380-4042-8c78-c6837d841d5e/rabbitmq/0.log" Mar 09 14:31:48 crc kubenswrapper[4764]: I0309 14:31:48.463604 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_b19144b6-cc4c-41d6-ad2e-409c021f657c/setup-container/0.log" Mar 09 14:31:49 crc kubenswrapper[4764]: I0309 14:31:49.113942 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b19144b6-cc4c-41d6-ad2e-409c021f657c/setup-container/0.log" Mar 09 14:31:49 crc kubenswrapper[4764]: I0309 14:31:49.171884 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg_6ee5c8cc-9f2b-42f8-aed5-37c3540bd300/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 14:31:49 crc kubenswrapper[4764]: I0309 14:31:49.256998 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b19144b6-cc4c-41d6-ad2e-409c021f657c/rabbitmq/0.log" Mar 09 14:31:49 crc kubenswrapper[4764]: I0309 14:31:49.492406 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5_de1bf125-47e1-499c-9cfe-ffbd5c03d194/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 14:31:49 crc kubenswrapper[4764]: I0309 14:31:49.587573 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-c4rw6_942b7017-cdda-4d7a-8be8-521111f4fcd1/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 14:31:49 crc kubenswrapper[4764]: I0309 14:31:49.704322 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-x85q2_23319545-4107-4a83-b7e1-955e4648bf7b/ssh-known-hosts-edpm-deployment/0.log" Mar 09 14:31:49 crc kubenswrapper[4764]: I0309 14:31:49.939340 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_5db22a0e-ee1a-4b26-9e49-b26644266834/tempest-tests-tempest-tests-runner/0.log" Mar 09 14:31:50 crc kubenswrapper[4764]: I0309 14:31:50.040818 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_3b233056-629a-4653-8726-76e6b231e58b/test-operator-logs-container/0.log" Mar 09 14:31:50 crc kubenswrapper[4764]: I0309 14:31:50.234598 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m_93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 14:32:00 crc kubenswrapper[4764]: I0309 14:32:00.162124 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551112-lcwnw"] Mar 09 14:32:00 crc kubenswrapper[4764]: E0309 14:32:00.163661 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30f041da-e1c7-4f92-b9b0-7c7b0495ecb6" containerName="container-00" Mar 09 14:32:00 crc kubenswrapper[4764]: I0309 14:32:00.163680 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="30f041da-e1c7-4f92-b9b0-7c7b0495ecb6" containerName="container-00" Mar 09 14:32:00 crc kubenswrapper[4764]: I0309 14:32:00.164840 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="30f041da-e1c7-4f92-b9b0-7c7b0495ecb6" containerName="container-00" Mar 09 14:32:00 crc kubenswrapper[4764]: I0309 14:32:00.165826 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551112-lcwnw" Mar 09 14:32:00 crc kubenswrapper[4764]: I0309 14:32:00.172936 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 14:32:00 crc kubenswrapper[4764]: I0309 14:32:00.173114 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:32:00 crc kubenswrapper[4764]: I0309 14:32:00.180911 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:32:00 crc kubenswrapper[4764]: I0309 14:32:00.194813 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551112-lcwnw"] Mar 09 14:32:00 crc kubenswrapper[4764]: I0309 14:32:00.282084 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvspg\" (UniqueName: \"kubernetes.io/projected/3078e21d-b42c-45f0-94c0-d3980ec27f1f-kube-api-access-qvspg\") pod \"auto-csr-approver-29551112-lcwnw\" (UID: \"3078e21d-b42c-45f0-94c0-d3980ec27f1f\") " pod="openshift-infra/auto-csr-approver-29551112-lcwnw" Mar 09 14:32:00 crc kubenswrapper[4764]: I0309 14:32:00.384364 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvspg\" (UniqueName: \"kubernetes.io/projected/3078e21d-b42c-45f0-94c0-d3980ec27f1f-kube-api-access-qvspg\") pod \"auto-csr-approver-29551112-lcwnw\" (UID: \"3078e21d-b42c-45f0-94c0-d3980ec27f1f\") " pod="openshift-infra/auto-csr-approver-29551112-lcwnw" Mar 09 14:32:00 crc kubenswrapper[4764]: I0309 14:32:00.593816 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvspg\" (UniqueName: \"kubernetes.io/projected/3078e21d-b42c-45f0-94c0-d3980ec27f1f-kube-api-access-qvspg\") pod \"auto-csr-approver-29551112-lcwnw\" (UID: \"3078e21d-b42c-45f0-94c0-d3980ec27f1f\") " 
pod="openshift-infra/auto-csr-approver-29551112-lcwnw" Mar 09 14:32:00 crc kubenswrapper[4764]: I0309 14:32:00.808040 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551112-lcwnw" Mar 09 14:32:01 crc kubenswrapper[4764]: I0309 14:32:01.372212 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551112-lcwnw"] Mar 09 14:32:01 crc kubenswrapper[4764]: I0309 14:32:01.610148 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551112-lcwnw" event={"ID":"3078e21d-b42c-45f0-94c0-d3980ec27f1f","Type":"ContainerStarted","Data":"7d6114e8bf85a2fac5ce54e9eb8aceb55e8157c40da54ae189ea5e91258cd946"} Mar 09 14:32:04 crc kubenswrapper[4764]: I0309 14:32:04.651133 4764 generic.go:334] "Generic (PLEG): container finished" podID="3078e21d-b42c-45f0-94c0-d3980ec27f1f" containerID="6874ccb9508c9920d5940aa912a69fc71adf07efd3d1ffdbd80a9373286c8c70" exitCode=0 Mar 09 14:32:04 crc kubenswrapper[4764]: I0309 14:32:04.651249 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551112-lcwnw" event={"ID":"3078e21d-b42c-45f0-94c0-d3980ec27f1f","Type":"ContainerDied","Data":"6874ccb9508c9920d5940aa912a69fc71adf07efd3d1ffdbd80a9373286c8c70"} Mar 09 14:32:06 crc kubenswrapper[4764]: I0309 14:32:06.111204 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551112-lcwnw" Mar 09 14:32:06 crc kubenswrapper[4764]: I0309 14:32:06.266238 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvspg\" (UniqueName: \"kubernetes.io/projected/3078e21d-b42c-45f0-94c0-d3980ec27f1f-kube-api-access-qvspg\") pod \"3078e21d-b42c-45f0-94c0-d3980ec27f1f\" (UID: \"3078e21d-b42c-45f0-94c0-d3980ec27f1f\") " Mar 09 14:32:06 crc kubenswrapper[4764]: I0309 14:32:06.274400 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3078e21d-b42c-45f0-94c0-d3980ec27f1f-kube-api-access-qvspg" (OuterVolumeSpecName: "kube-api-access-qvspg") pod "3078e21d-b42c-45f0-94c0-d3980ec27f1f" (UID: "3078e21d-b42c-45f0-94c0-d3980ec27f1f"). InnerVolumeSpecName "kube-api-access-qvspg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:32:06 crc kubenswrapper[4764]: I0309 14:32:06.371574 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvspg\" (UniqueName: \"kubernetes.io/projected/3078e21d-b42c-45f0-94c0-d3980ec27f1f-kube-api-access-qvspg\") on node \"crc\" DevicePath \"\"" Mar 09 14:32:06 crc kubenswrapper[4764]: I0309 14:32:06.716186 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551112-lcwnw" event={"ID":"3078e21d-b42c-45f0-94c0-d3980ec27f1f","Type":"ContainerDied","Data":"7d6114e8bf85a2fac5ce54e9eb8aceb55e8157c40da54ae189ea5e91258cd946"} Mar 09 14:32:06 crc kubenswrapper[4764]: I0309 14:32:06.716251 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d6114e8bf85a2fac5ce54e9eb8aceb55e8157c40da54ae189ea5e91258cd946" Mar 09 14:32:06 crc kubenswrapper[4764]: I0309 14:32:06.716346 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551112-lcwnw" Mar 09 14:32:07 crc kubenswrapper[4764]: I0309 14:32:07.229806 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551106-dnhln"] Mar 09 14:32:07 crc kubenswrapper[4764]: I0309 14:32:07.244213 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551106-dnhln"] Mar 09 14:32:07 crc kubenswrapper[4764]: I0309 14:32:07.572907 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d33fc9b0-e440-4f1b-9522-1abec06eca2a" path="/var/lib/kubelet/pods/d33fc9b0-e440-4f1b-9522-1abec06eca2a/volumes" Mar 09 14:32:07 crc kubenswrapper[4764]: I0309 14:32:07.609702 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_519ac270-ea24-47c1-b4f3-d94b0add96d1/memcached/0.log" Mar 09 14:32:12 crc kubenswrapper[4764]: I0309 14:32:12.253634 4764 scope.go:117] "RemoveContainer" containerID="8f9d999651aa510edfbd48cd8433fc728bf24d4c32451ce578a41c04f581a328" Mar 09 14:32:23 crc kubenswrapper[4764]: I0309 14:32:23.044888 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5_3f89e888-fc0d-48c0-ad4c-978e058ffebd/util/0.log" Mar 09 14:32:23 crc kubenswrapper[4764]: I0309 14:32:23.307228 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5_3f89e888-fc0d-48c0-ad4c-978e058ffebd/pull/0.log" Mar 09 14:32:23 crc kubenswrapper[4764]: I0309 14:32:23.352898 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5_3f89e888-fc0d-48c0-ad4c-978e058ffebd/util/0.log" Mar 09 14:32:23 crc kubenswrapper[4764]: I0309 14:32:23.388266 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5_3f89e888-fc0d-48c0-ad4c-978e058ffebd/pull/0.log" Mar 09 14:32:23 crc kubenswrapper[4764]: I0309 14:32:23.584738 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5_3f89e888-fc0d-48c0-ad4c-978e058ffebd/pull/0.log" Mar 09 14:32:23 crc kubenswrapper[4764]: I0309 14:32:23.626984 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5_3f89e888-fc0d-48c0-ad4c-978e058ffebd/util/0.log" Mar 09 14:32:23 crc kubenswrapper[4764]: I0309 14:32:23.689084 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5_3f89e888-fc0d-48c0-ad4c-978e058ffebd/extract/0.log" Mar 09 14:32:24 crc kubenswrapper[4764]: I0309 14:32:24.733833 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-5d87c9d997-cmtpc_725c0dd0-07d1-4a1c-b223-e8bec76cc7ff/manager/0.log" Mar 09 14:32:25 crc kubenswrapper[4764]: I0309 14:32:25.151952 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-64db6967f8-mjf6m_488ff419-d889-4778-96cf-a11006c49507/manager/0.log" Mar 09 14:32:25 crc kubenswrapper[4764]: I0309 14:32:25.331219 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-cf99c678f-jnmbv_7295db10-1c36-4c17-bf1e-4c4a702c201b/manager/0.log" Mar 09 14:32:25 crc kubenswrapper[4764]: I0309 14:32:25.624774 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-78bc7f9bd9-5xc2s_3da43711-be34-4189-b686-e8e9bc9e7265/manager/0.log" Mar 09 14:32:26 crc kubenswrapper[4764]: I0309 
14:32:26.245765 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-545456dc4-hvpbz_5bd11a3c-79f3-4aa2-9d5a-78ec3c18cb31/manager/0.log" Mar 09 14:32:26 crc kubenswrapper[4764]: I0309 14:32:26.519317 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-m58s9_bfda7896-83e3-407c-9eb5-74fbc11104f0/manager/0.log" Mar 09 14:32:26 crc kubenswrapper[4764]: I0309 14:32:26.745123 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-nppjq_4c271ca0-0c25-46d1-b730-e94f68397e29/manager/0.log" Mar 09 14:32:27 crc kubenswrapper[4764]: I0309 14:32:27.265495 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7c789f89c6-wv2rp_32eb5815-c566-4177-8b47-f756807d4a30/manager/0.log" Mar 09 14:32:27 crc kubenswrapper[4764]: I0309 14:32:27.393103 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c7bcbc569-qhpvs_5cd7eb92-2fae-4978-a5e9-58fa87c63e84/manager/0.log" Mar 09 14:32:27 crc kubenswrapper[4764]: I0309 14:32:27.580720 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-7b6bfb6475-vkns5_da851ddd-2b27-45f0-b149-de32ae21ad91/manager/0.log" Mar 09 14:32:27 crc kubenswrapper[4764]: I0309 14:32:27.771231 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54688575f-cgv66_2ddf1e89-9c89-4052-aa1b-6fb84438b86d/manager/0.log" Mar 09 14:32:27 crc kubenswrapper[4764]: I0309 14:32:27.903282 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-74b6b5dc96-dm7rn_26535a82-8d70-4623-b2b4-7dd1546d48d6/manager/0.log" Mar 09 14:32:28 crc 
kubenswrapper[4764]: I0309 14:32:28.053601 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5d86c7ddb7-gv2sm_b54e2237-603a-44ad-a129-04736cf749b2/manager/0.log" Mar 09 14:32:28 crc kubenswrapper[4764]: I0309 14:32:28.149941 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm_47bd7072-a414-4ce8-800b-753b7054be23/manager/0.log" Mar 09 14:32:28 crc kubenswrapper[4764]: I0309 14:32:28.370390 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:32:28 crc kubenswrapper[4764]: I0309 14:32:28.370473 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:32:28 crc kubenswrapper[4764]: I0309 14:32:28.540790 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6754b7f846-ns9zn_67c57635-59f1-48a2-9823-c86732eabbf6/operator/0.log" Mar 09 14:32:28 crc kubenswrapper[4764]: I0309 14:32:28.624999 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-lvrg9_8e6c087a-8aaa-427c-822b-a274e19cc440/registry-server/0.log" Mar 09 14:32:28 crc kubenswrapper[4764]: I0309 14:32:28.913364 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-75684d597f-jfgzw_615473d3-072e-4685-8f32-73a44badf1e2/manager/0.log" Mar 09 14:32:28 crc 
kubenswrapper[4764]: I0309 14:32:28.940257 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-648564c9fc-8ms5w_c44e76b2-0de9-4a5b-93ee-536c6300157f/manager/0.log" Mar 09 14:32:29 crc kubenswrapper[4764]: I0309 14:32:29.288827 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-6v2sq_01ea99aa-eb21-4799-9557-42c3fb55945a/operator/0.log" Mar 09 14:32:29 crc kubenswrapper[4764]: I0309 14:32:29.389102 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9b9ff9f4d-bf8w8_003210d3-5572-44bd-aae5-d5e24aac16a5/manager/0.log" Mar 09 14:32:29 crc kubenswrapper[4764]: I0309 14:32:29.696022 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5fdb694969-4cpsz_c6a51c61-770b-4b46-9dd3-1f61e6f5ebc8/manager/0.log" Mar 09 14:32:29 crc kubenswrapper[4764]: I0309 14:32:29.766173 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-55b5ff4dbb-d65xp_867908a2-f085-4f3d-b569-84c915f730b1/manager/0.log" Mar 09 14:32:29 crc kubenswrapper[4764]: I0309 14:32:29.973321 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-7f8nr_f705ec78-e960-4200-b5a6-f3d4310f1bd5/manager/0.log" Mar 09 14:32:30 crc kubenswrapper[4764]: I0309 14:32:30.608935 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6746d697b-lr6nx_e11f44d8-58a5-4fc7-b05b-e2e688647d01/manager/0.log" Mar 09 14:32:34 crc kubenswrapper[4764]: I0309 14:32:34.306442 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6db6876945-82cg8_e220a3f1-4dbe-4ee6-9b19-26985fa998cf/manager/0.log" Mar 
09 14:32:54 crc kubenswrapper[4764]: I0309 14:32:54.525064 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-9k28f_4ba55602-0e3f-4722-b437-546732351bc4/control-plane-machine-set-operator/0.log" Mar 09 14:32:54 crc kubenswrapper[4764]: I0309 14:32:54.776547 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t8ft9_4125448d-5832-43c2-8dba-d95adde7458a/kube-rbac-proxy/0.log" Mar 09 14:32:54 crc kubenswrapper[4764]: I0309 14:32:54.776846 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t8ft9_4125448d-5832-43c2-8dba-d95adde7458a/machine-api-operator/0.log" Mar 09 14:32:58 crc kubenswrapper[4764]: I0309 14:32:58.370781 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:32:58 crc kubenswrapper[4764]: I0309 14:32:58.371736 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:33:10 crc kubenswrapper[4764]: I0309 14:33:10.823750 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-lhpfw_2eef62f2-5973-47e2-b921-9e1a05b9f8fb/cert-manager-controller/0.log" Mar 09 14:33:11 crc kubenswrapper[4764]: I0309 14:33:11.485510 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-qr7fv_0a35f012-3965-4680-aa01-9fa97f956c68/cert-manager-cainjector/0.log" Mar 09 14:33:11 crc kubenswrapper[4764]: I0309 14:33:11.655431 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-j8rlp_09aeffa2-590d-4062-95ff-40dbdda54df7/cert-manager-webhook/0.log" Mar 09 14:33:27 crc kubenswrapper[4764]: I0309 14:33:27.493788 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-qpjz4_fc521772-06d5-47ec-85d0-6162bb98af30/nmstate-console-plugin/0.log" Mar 09 14:33:27 crc kubenswrapper[4764]: I0309 14:33:27.808469 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-sl5hn_6dc5759c-db8c-4025-bc16-a07e4dc6278a/nmstate-handler/0.log" Mar 09 14:33:28 crc kubenswrapper[4764]: I0309 14:33:28.179816 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-jschs_abca721f-d47f-4e38-ab9e-0832de2c70e6/nmstate-metrics/0.log" Mar 09 14:33:28 crc kubenswrapper[4764]: I0309 14:33:28.222304 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-jschs_abca721f-d47f-4e38-ab9e-0832de2c70e6/kube-rbac-proxy/0.log" Mar 09 14:33:28 crc kubenswrapper[4764]: I0309 14:33:28.333320 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-q5kvf_33e9b814-6368-46c6-aae2-5a3df1839d29/nmstate-operator/0.log" Mar 09 14:33:28 crc kubenswrapper[4764]: I0309 14:33:28.397694 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:33:28 crc kubenswrapper[4764]: I0309 14:33:28.398069 4764 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:33:28 crc kubenswrapper[4764]: I0309 14:33:28.398221 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 14:33:28 crc kubenswrapper[4764]: I0309 14:33:28.399302 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"} pod="openshift-machine-config-operator/machine-config-daemon-xxczl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 14:33:28 crc kubenswrapper[4764]: I0309 14:33:28.399449 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" containerID="cri-o://8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2" gracePeriod=600 Mar 09 14:33:28 crc kubenswrapper[4764]: I0309 14:33:28.492839 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-wv755_f339e495-f347-45b8-b9da-2cd832ac4300/nmstate-webhook/0.log" Mar 09 14:33:28 crc kubenswrapper[4764]: E0309 14:33:28.523754 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:33:28 crc kubenswrapper[4764]: I0309 14:33:28.562655 4764 generic.go:334] "Generic (PLEG): container finished" podID="6bcdd179-43c2-427c-9fac-7155c122e922" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2" exitCode=0 Mar 09 14:33:28 crc kubenswrapper[4764]: I0309 14:33:28.562729 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerDied","Data":"8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"} Mar 09 14:33:28 crc kubenswrapper[4764]: I0309 14:33:28.562879 4764 scope.go:117] "RemoveContainer" containerID="956798d3640b6af2f25ed443a7b5b3c26e0da670aff756c574b1a30cc50879dd" Mar 09 14:33:28 crc kubenswrapper[4764]: I0309 14:33:28.564085 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2" Mar 09 14:33:28 crc kubenswrapper[4764]: E0309 14:33:28.564462 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:33:42 crc kubenswrapper[4764]: I0309 14:33:42.560407 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2" Mar 09 14:33:42 crc kubenswrapper[4764]: E0309 14:33:42.561520 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:33:43 crc kubenswrapper[4764]: I0309 14:33:43.776479 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qzlmx"] Mar 09 14:33:43 crc kubenswrapper[4764]: E0309 14:33:43.777470 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3078e21d-b42c-45f0-94c0-d3980ec27f1f" containerName="oc" Mar 09 14:33:43 crc kubenswrapper[4764]: I0309 14:33:43.777489 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3078e21d-b42c-45f0-94c0-d3980ec27f1f" containerName="oc" Mar 09 14:33:43 crc kubenswrapper[4764]: I0309 14:33:43.777725 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="3078e21d-b42c-45f0-94c0-d3980ec27f1f" containerName="oc" Mar 09 14:33:43 crc kubenswrapper[4764]: I0309 14:33:43.805828 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qzlmx"] Mar 09 14:33:43 crc kubenswrapper[4764]: I0309 14:33:43.805994 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:33:43 crc kubenswrapper[4764]: I0309 14:33:43.825637 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2efcd9d-6a11-4afd-8903-0f280284cdaa-catalog-content\") pod \"redhat-operators-qzlmx\" (UID: \"f2efcd9d-6a11-4afd-8903-0f280284cdaa\") " pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:33:43 crc kubenswrapper[4764]: I0309 14:33:43.825820 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2efcd9d-6a11-4afd-8903-0f280284cdaa-utilities\") pod \"redhat-operators-qzlmx\" (UID: \"f2efcd9d-6a11-4afd-8903-0f280284cdaa\") " pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:33:43 crc kubenswrapper[4764]: I0309 14:33:43.828035 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv8kk\" (UniqueName: \"kubernetes.io/projected/f2efcd9d-6a11-4afd-8903-0f280284cdaa-kube-api-access-lv8kk\") pod \"redhat-operators-qzlmx\" (UID: \"f2efcd9d-6a11-4afd-8903-0f280284cdaa\") " pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:33:43 crc kubenswrapper[4764]: I0309 14:33:43.930448 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2efcd9d-6a11-4afd-8903-0f280284cdaa-catalog-content\") pod \"redhat-operators-qzlmx\" (UID: \"f2efcd9d-6a11-4afd-8903-0f280284cdaa\") " pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:33:43 crc kubenswrapper[4764]: I0309 14:33:43.930838 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2efcd9d-6a11-4afd-8903-0f280284cdaa-utilities\") pod \"redhat-operators-qzlmx\" (UID: 
\"f2efcd9d-6a11-4afd-8903-0f280284cdaa\") " pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:33:43 crc kubenswrapper[4764]: I0309 14:33:43.930950 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv8kk\" (UniqueName: \"kubernetes.io/projected/f2efcd9d-6a11-4afd-8903-0f280284cdaa-kube-api-access-lv8kk\") pod \"redhat-operators-qzlmx\" (UID: \"f2efcd9d-6a11-4afd-8903-0f280284cdaa\") " pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:33:43 crc kubenswrapper[4764]: I0309 14:33:43.931411 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2efcd9d-6a11-4afd-8903-0f280284cdaa-catalog-content\") pod \"redhat-operators-qzlmx\" (UID: \"f2efcd9d-6a11-4afd-8903-0f280284cdaa\") " pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:33:43 crc kubenswrapper[4764]: I0309 14:33:43.931620 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2efcd9d-6a11-4afd-8903-0f280284cdaa-utilities\") pod \"redhat-operators-qzlmx\" (UID: \"f2efcd9d-6a11-4afd-8903-0f280284cdaa\") " pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:33:43 crc kubenswrapper[4764]: I0309 14:33:43.955237 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv8kk\" (UniqueName: \"kubernetes.io/projected/f2efcd9d-6a11-4afd-8903-0f280284cdaa-kube-api-access-lv8kk\") pod \"redhat-operators-qzlmx\" (UID: \"f2efcd9d-6a11-4afd-8903-0f280284cdaa\") " pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:33:44 crc kubenswrapper[4764]: I0309 14:33:44.157685 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:33:44 crc kubenswrapper[4764]: I0309 14:33:44.775672 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qzlmx"] Mar 09 14:33:45 crc kubenswrapper[4764]: I0309 14:33:45.740530 4764 generic.go:334] "Generic (PLEG): container finished" podID="f2efcd9d-6a11-4afd-8903-0f280284cdaa" containerID="edfafef8d0234c71f87bec73e4123a65431ec22582183b56a7ee1a6dea625511" exitCode=0 Mar 09 14:33:45 crc kubenswrapper[4764]: I0309 14:33:45.740918 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzlmx" event={"ID":"f2efcd9d-6a11-4afd-8903-0f280284cdaa","Type":"ContainerDied","Data":"edfafef8d0234c71f87bec73e4123a65431ec22582183b56a7ee1a6dea625511"} Mar 09 14:33:45 crc kubenswrapper[4764]: I0309 14:33:45.740953 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzlmx" event={"ID":"f2efcd9d-6a11-4afd-8903-0f280284cdaa","Type":"ContainerStarted","Data":"e9d31a776ca5ada6ff61eb8dd1dad9ae7a8028937181ba856028bc7288265bfe"} Mar 09 14:33:47 crc kubenswrapper[4764]: I0309 14:33:47.765891 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzlmx" event={"ID":"f2efcd9d-6a11-4afd-8903-0f280284cdaa","Type":"ContainerStarted","Data":"86c4ef2cdaf5c0b3f5f3860085ce0371469edd4c05bde54fbade9ae91ce3c8fe"} Mar 09 14:33:49 crc kubenswrapper[4764]: I0309 14:33:49.786322 4764 generic.go:334] "Generic (PLEG): container finished" podID="f2efcd9d-6a11-4afd-8903-0f280284cdaa" containerID="86c4ef2cdaf5c0b3f5f3860085ce0371469edd4c05bde54fbade9ae91ce3c8fe" exitCode=0 Mar 09 14:33:49 crc kubenswrapper[4764]: I0309 14:33:49.786408 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzlmx" 
event={"ID":"f2efcd9d-6a11-4afd-8903-0f280284cdaa","Type":"ContainerDied","Data":"86c4ef2cdaf5c0b3f5f3860085ce0371469edd4c05bde54fbade9ae91ce3c8fe"} Mar 09 14:33:50 crc kubenswrapper[4764]: I0309 14:33:50.803440 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzlmx" event={"ID":"f2efcd9d-6a11-4afd-8903-0f280284cdaa","Type":"ContainerStarted","Data":"6bd50070f5a28319990ff792d02341394570cf6f918cac77f2bd667b387c03b7"} Mar 09 14:33:53 crc kubenswrapper[4764]: I0309 14:33:53.560501 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2" Mar 09 14:33:53 crc kubenswrapper[4764]: E0309 14:33:53.561411 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:33:54 crc kubenswrapper[4764]: I0309 14:33:54.157858 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:33:54 crc kubenswrapper[4764]: I0309 14:33:54.158363 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:33:55 crc kubenswrapper[4764]: I0309 14:33:55.215694 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qzlmx" podUID="f2efcd9d-6a11-4afd-8903-0f280284cdaa" containerName="registry-server" probeResult="failure" output=< Mar 09 14:33:55 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Mar 09 14:33:55 crc kubenswrapper[4764]: > Mar 09 14:34:00 crc kubenswrapper[4764]: I0309 14:34:00.144741 4764 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qzlmx" podStartSLOduration=12.665717086 podStartE2EDuration="17.1447196s" podCreationTimestamp="2026-03-09 14:33:43 +0000 UTC" firstStartedPulling="2026-03-09 14:33:45.743481696 +0000 UTC m=+4380.993653604" lastFinishedPulling="2026-03-09 14:33:50.22248421 +0000 UTC m=+4385.472656118" observedRunningTime="2026-03-09 14:33:50.833136723 +0000 UTC m=+4386.083308631" watchObservedRunningTime="2026-03-09 14:34:00.1447196 +0000 UTC m=+4395.394891508" Mar 09 14:34:00 crc kubenswrapper[4764]: I0309 14:34:00.150042 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551114-9w2wl"] Mar 09 14:34:00 crc kubenswrapper[4764]: I0309 14:34:00.151581 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551114-9w2wl" Mar 09 14:34:00 crc kubenswrapper[4764]: I0309 14:34:00.154092 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 14:34:00 crc kubenswrapper[4764]: I0309 14:34:00.155287 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:34:00 crc kubenswrapper[4764]: I0309 14:34:00.159780 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:34:00 crc kubenswrapper[4764]: I0309 14:34:00.162771 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551114-9w2wl"] Mar 09 14:34:00 crc kubenswrapper[4764]: I0309 14:34:00.247532 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggjnq\" (UniqueName: \"kubernetes.io/projected/3b063d56-a0eb-4b2d-8b53-2c63feead99e-kube-api-access-ggjnq\") pod \"auto-csr-approver-29551114-9w2wl\" (UID: 
\"3b063d56-a0eb-4b2d-8b53-2c63feead99e\") " pod="openshift-infra/auto-csr-approver-29551114-9w2wl" Mar 09 14:34:00 crc kubenswrapper[4764]: I0309 14:34:00.350330 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggjnq\" (UniqueName: \"kubernetes.io/projected/3b063d56-a0eb-4b2d-8b53-2c63feead99e-kube-api-access-ggjnq\") pod \"auto-csr-approver-29551114-9w2wl\" (UID: \"3b063d56-a0eb-4b2d-8b53-2c63feead99e\") " pod="openshift-infra/auto-csr-approver-29551114-9w2wl" Mar 09 14:34:00 crc kubenswrapper[4764]: I0309 14:34:00.373849 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggjnq\" (UniqueName: \"kubernetes.io/projected/3b063d56-a0eb-4b2d-8b53-2c63feead99e-kube-api-access-ggjnq\") pod \"auto-csr-approver-29551114-9w2wl\" (UID: \"3b063d56-a0eb-4b2d-8b53-2c63feead99e\") " pod="openshift-infra/auto-csr-approver-29551114-9w2wl" Mar 09 14:34:00 crc kubenswrapper[4764]: I0309 14:34:00.475222 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551114-9w2wl" Mar 09 14:34:00 crc kubenswrapper[4764]: I0309 14:34:00.992454 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551114-9w2wl"] Mar 09 14:34:01 crc kubenswrapper[4764]: W0309 14:34:01.002826 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b063d56_a0eb_4b2d_8b53_2c63feead99e.slice/crio-4af4ce420cb2f651ee10eebaabb6884f6f65c51443c9a04bcf3b4f43c4049c40 WatchSource:0}: Error finding container 4af4ce420cb2f651ee10eebaabb6884f6f65c51443c9a04bcf3b4f43c4049c40: Status 404 returned error can't find the container with id 4af4ce420cb2f651ee10eebaabb6884f6f65c51443c9a04bcf3b4f43c4049c40 Mar 09 14:34:01 crc kubenswrapper[4764]: I0309 14:34:01.921262 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551114-9w2wl" event={"ID":"3b063d56-a0eb-4b2d-8b53-2c63feead99e","Type":"ContainerStarted","Data":"4af4ce420cb2f651ee10eebaabb6884f6f65c51443c9a04bcf3b4f43c4049c40"} Mar 09 14:34:03 crc kubenswrapper[4764]: I0309 14:34:03.685262 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-lgrkv_709e786e-5c7d-45d3-ac38-78351dfbec81/kube-rbac-proxy/0.log" Mar 09 14:34:03 crc kubenswrapper[4764]: I0309 14:34:03.817628 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-lgrkv_709e786e-5c7d-45d3-ac38-78351dfbec81/controller/0.log" Mar 09 14:34:03 crc kubenswrapper[4764]: I0309 14:34:03.946216 4764 generic.go:334] "Generic (PLEG): container finished" podID="3b063d56-a0eb-4b2d-8b53-2c63feead99e" containerID="ea38c17635079a211c8b307953c91fb9e37a06f4db03c9c46e225e504f02afec" exitCode=0 Mar 09 14:34:03 crc kubenswrapper[4764]: I0309 14:34:03.946267 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29551114-9w2wl" event={"ID":"3b063d56-a0eb-4b2d-8b53-2c63feead99e","Type":"ContainerDied","Data":"ea38c17635079a211c8b307953c91fb9e37a06f4db03c9c46e225e504f02afec"} Mar 09 14:34:04 crc kubenswrapper[4764]: I0309 14:34:04.020250 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/cp-frr-files/0.log" Mar 09 14:34:04 crc kubenswrapper[4764]: I0309 14:34:04.206534 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/cp-metrics/0.log" Mar 09 14:34:04 crc kubenswrapper[4764]: I0309 14:34:04.216533 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/cp-frr-files/0.log" Mar 09 14:34:04 crc kubenswrapper[4764]: I0309 14:34:04.224430 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:34:04 crc kubenswrapper[4764]: I0309 14:34:04.252060 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/cp-reloader/0.log" Mar 09 14:34:04 crc kubenswrapper[4764]: I0309 14:34:04.283302 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:34:04 crc kubenswrapper[4764]: I0309 14:34:04.304876 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/cp-reloader/0.log" Mar 09 14:34:04 crc kubenswrapper[4764]: I0309 14:34:04.475049 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qzlmx"] Mar 09 14:34:04 crc kubenswrapper[4764]: I0309 14:34:04.775917 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/cp-reloader/0.log" Mar 09 14:34:04 crc kubenswrapper[4764]: I0309 14:34:04.850736 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/cp-frr-files/0.log" Mar 09 14:34:04 crc kubenswrapper[4764]: I0309 14:34:04.863356 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/cp-metrics/0.log" Mar 09 14:34:04 crc kubenswrapper[4764]: I0309 14:34:04.877933 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/cp-metrics/0.log" Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.141590 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/cp-reloader/0.log" Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.206398 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/cp-metrics/0.log" Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.210441 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/controller/0.log" Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.217692 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/cp-frr-files/0.log" Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.381477 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551114-9w2wl" Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.488200 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggjnq\" (UniqueName: \"kubernetes.io/projected/3b063d56-a0eb-4b2d-8b53-2c63feead99e-kube-api-access-ggjnq\") pod \"3b063d56-a0eb-4b2d-8b53-2c63feead99e\" (UID: \"3b063d56-a0eb-4b2d-8b53-2c63feead99e\") " Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.525931 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b063d56-a0eb-4b2d-8b53-2c63feead99e-kube-api-access-ggjnq" (OuterVolumeSpecName: "kube-api-access-ggjnq") pod "3b063d56-a0eb-4b2d-8b53-2c63feead99e" (UID: "3b063d56-a0eb-4b2d-8b53-2c63feead99e"). InnerVolumeSpecName "kube-api-access-ggjnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.580752 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/kube-rbac-proxy/0.log" Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.590240 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/kube-rbac-proxy-frr/0.log" Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.593060 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggjnq\" (UniqueName: \"kubernetes.io/projected/3b063d56-a0eb-4b2d-8b53-2c63feead99e-kube-api-access-ggjnq\") on node \"crc\" DevicePath \"\"" Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.600541 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/frr-metrics/0.log" Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.813653 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/reloader/0.log" Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.883703 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-wqd8z_72efa175-2568-4c62-a97e-35893887fe82/frr-k8s-webhook-server/0.log" Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.973118 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qzlmx" podUID="f2efcd9d-6a11-4afd-8903-0f280284cdaa" containerName="registry-server" containerID="cri-o://6bd50070f5a28319990ff792d02341394570cf6f918cac77f2bd667b387c03b7" gracePeriod=2 Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.973620 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551114-9w2wl" Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.973674 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551114-9w2wl" event={"ID":"3b063d56-a0eb-4b2d-8b53-2c63feead99e","Type":"ContainerDied","Data":"4af4ce420cb2f651ee10eebaabb6884f6f65c51443c9a04bcf3b4f43c4049c40"} Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.973729 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4af4ce420cb2f651ee10eebaabb6884f6f65c51443c9a04bcf3b4f43c4049c40" Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.285109 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-659b995f59-8s255_b89770ec-e502-4b3a-8233-8c9aa76d55de/manager/0.log" Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.467243 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7bccb4c96-wqdrv_ed37a5d1-5d4b-41fb-8476-189def32c909/webhook-server/0.log" Mar 09 14:34:06 crc kubenswrapper[4764]: 
I0309 14:34:06.468001 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551108-8tqjz"] Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.483326 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551108-8tqjz"] Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.536818 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.573714 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2" Mar 09 14:34:06 crc kubenswrapper[4764]: E0309 14:34:06.574533 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.631465 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lv8kk\" (UniqueName: \"kubernetes.io/projected/f2efcd9d-6a11-4afd-8903-0f280284cdaa-kube-api-access-lv8kk\") pod \"f2efcd9d-6a11-4afd-8903-0f280284cdaa\" (UID: \"f2efcd9d-6a11-4afd-8903-0f280284cdaa\") " Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.631615 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2efcd9d-6a11-4afd-8903-0f280284cdaa-utilities\") pod \"f2efcd9d-6a11-4afd-8903-0f280284cdaa\" (UID: \"f2efcd9d-6a11-4afd-8903-0f280284cdaa\") " Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.631897 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2efcd9d-6a11-4afd-8903-0f280284cdaa-catalog-content\") pod \"f2efcd9d-6a11-4afd-8903-0f280284cdaa\" (UID: \"f2efcd9d-6a11-4afd-8903-0f280284cdaa\") " Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.634274 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2efcd9d-6a11-4afd-8903-0f280284cdaa-utilities" (OuterVolumeSpecName: "utilities") pod "f2efcd9d-6a11-4afd-8903-0f280284cdaa" (UID: "f2efcd9d-6a11-4afd-8903-0f280284cdaa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.642036 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2efcd9d-6a11-4afd-8903-0f280284cdaa-kube-api-access-lv8kk" (OuterVolumeSpecName: "kube-api-access-lv8kk") pod "f2efcd9d-6a11-4afd-8903-0f280284cdaa" (UID: "f2efcd9d-6a11-4afd-8903-0f280284cdaa"). InnerVolumeSpecName "kube-api-access-lv8kk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.734109 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lv8kk\" (UniqueName: \"kubernetes.io/projected/f2efcd9d-6a11-4afd-8903-0f280284cdaa-kube-api-access-lv8kk\") on node \"crc\" DevicePath \"\"" Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.734559 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2efcd9d-6a11-4afd-8903-0f280284cdaa-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.757617 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-2z5wp_bfd899d4-a0df-47e3-aa36-1cf690235c45/kube-rbac-proxy/0.log" Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.784310 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2efcd9d-6a11-4afd-8903-0f280284cdaa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f2efcd9d-6a11-4afd-8903-0f280284cdaa" (UID: "f2efcd9d-6a11-4afd-8903-0f280284cdaa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.838424 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2efcd9d-6a11-4afd-8903-0f280284cdaa-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.986759 4764 generic.go:334] "Generic (PLEG): container finished" podID="f2efcd9d-6a11-4afd-8903-0f280284cdaa" containerID="6bd50070f5a28319990ff792d02341394570cf6f918cac77f2bd667b387c03b7" exitCode=0 Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.986819 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzlmx" event={"ID":"f2efcd9d-6a11-4afd-8903-0f280284cdaa","Type":"ContainerDied","Data":"6bd50070f5a28319990ff792d02341394570cf6f918cac77f2bd667b387c03b7"} Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.986858 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzlmx" event={"ID":"f2efcd9d-6a11-4afd-8903-0f280284cdaa","Type":"ContainerDied","Data":"e9d31a776ca5ada6ff61eb8dd1dad9ae7a8028937181ba856028bc7288265bfe"} Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.986881 4764 scope.go:117] "RemoveContainer" containerID="6bd50070f5a28319990ff792d02341394570cf6f918cac77f2bd667b387c03b7" Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.986901 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:34:07 crc kubenswrapper[4764]: I0309 14:34:07.013928 4764 scope.go:117] "RemoveContainer" containerID="86c4ef2cdaf5c0b3f5f3860085ce0371469edd4c05bde54fbade9ae91ce3c8fe" Mar 09 14:34:07 crc kubenswrapper[4764]: I0309 14:34:07.045028 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qzlmx"] Mar 09 14:34:07 crc kubenswrapper[4764]: I0309 14:34:07.058934 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qzlmx"] Mar 09 14:34:07 crc kubenswrapper[4764]: I0309 14:34:07.060054 4764 scope.go:117] "RemoveContainer" containerID="edfafef8d0234c71f87bec73e4123a65431ec22582183b56a7ee1a6dea625511" Mar 09 14:34:07 crc kubenswrapper[4764]: I0309 14:34:07.121456 4764 scope.go:117] "RemoveContainer" containerID="6bd50070f5a28319990ff792d02341394570cf6f918cac77f2bd667b387c03b7" Mar 09 14:34:07 crc kubenswrapper[4764]: E0309 14:34:07.122072 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bd50070f5a28319990ff792d02341394570cf6f918cac77f2bd667b387c03b7\": container with ID starting with 6bd50070f5a28319990ff792d02341394570cf6f918cac77f2bd667b387c03b7 not found: ID does not exist" containerID="6bd50070f5a28319990ff792d02341394570cf6f918cac77f2bd667b387c03b7" Mar 09 14:34:07 crc kubenswrapper[4764]: I0309 14:34:07.122122 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bd50070f5a28319990ff792d02341394570cf6f918cac77f2bd667b387c03b7"} err="failed to get container status \"6bd50070f5a28319990ff792d02341394570cf6f918cac77f2bd667b387c03b7\": rpc error: code = NotFound desc = could not find container \"6bd50070f5a28319990ff792d02341394570cf6f918cac77f2bd667b387c03b7\": container with ID starting with 6bd50070f5a28319990ff792d02341394570cf6f918cac77f2bd667b387c03b7 not found: ID does 
not exist" Mar 09 14:34:07 crc kubenswrapper[4764]: I0309 14:34:07.122156 4764 scope.go:117] "RemoveContainer" containerID="86c4ef2cdaf5c0b3f5f3860085ce0371469edd4c05bde54fbade9ae91ce3c8fe" Mar 09 14:34:07 crc kubenswrapper[4764]: E0309 14:34:07.122877 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86c4ef2cdaf5c0b3f5f3860085ce0371469edd4c05bde54fbade9ae91ce3c8fe\": container with ID starting with 86c4ef2cdaf5c0b3f5f3860085ce0371469edd4c05bde54fbade9ae91ce3c8fe not found: ID does not exist" containerID="86c4ef2cdaf5c0b3f5f3860085ce0371469edd4c05bde54fbade9ae91ce3c8fe" Mar 09 14:34:07 crc kubenswrapper[4764]: I0309 14:34:07.122907 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86c4ef2cdaf5c0b3f5f3860085ce0371469edd4c05bde54fbade9ae91ce3c8fe"} err="failed to get container status \"86c4ef2cdaf5c0b3f5f3860085ce0371469edd4c05bde54fbade9ae91ce3c8fe\": rpc error: code = NotFound desc = could not find container \"86c4ef2cdaf5c0b3f5f3860085ce0371469edd4c05bde54fbade9ae91ce3c8fe\": container with ID starting with 86c4ef2cdaf5c0b3f5f3860085ce0371469edd4c05bde54fbade9ae91ce3c8fe not found: ID does not exist" Mar 09 14:34:07 crc kubenswrapper[4764]: I0309 14:34:07.122928 4764 scope.go:117] "RemoveContainer" containerID="edfafef8d0234c71f87bec73e4123a65431ec22582183b56a7ee1a6dea625511" Mar 09 14:34:07 crc kubenswrapper[4764]: E0309 14:34:07.123194 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edfafef8d0234c71f87bec73e4123a65431ec22582183b56a7ee1a6dea625511\": container with ID starting with edfafef8d0234c71f87bec73e4123a65431ec22582183b56a7ee1a6dea625511 not found: ID does not exist" containerID="edfafef8d0234c71f87bec73e4123a65431ec22582183b56a7ee1a6dea625511" Mar 09 14:34:07 crc kubenswrapper[4764]: I0309 14:34:07.123223 4764 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edfafef8d0234c71f87bec73e4123a65431ec22582183b56a7ee1a6dea625511"} err="failed to get container status \"edfafef8d0234c71f87bec73e4123a65431ec22582183b56a7ee1a6dea625511\": rpc error: code = NotFound desc = could not find container \"edfafef8d0234c71f87bec73e4123a65431ec22582183b56a7ee1a6dea625511\": container with ID starting with edfafef8d0234c71f87bec73e4123a65431ec22582183b56a7ee1a6dea625511 not found: ID does not exist" Mar 09 14:34:07 crc kubenswrapper[4764]: I0309 14:34:07.312266 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/frr/0.log" Mar 09 14:34:07 crc kubenswrapper[4764]: I0309 14:34:07.359591 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-2z5wp_bfd899d4-a0df-47e3-aa36-1cf690235c45/speaker/0.log" Mar 09 14:34:07 crc kubenswrapper[4764]: I0309 14:34:07.575474 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f49d558-1317-4099-abb0-bb57895b3917" path="/var/lib/kubelet/pods/1f49d558-1317-4099-abb0-bb57895b3917/volumes" Mar 09 14:34:07 crc kubenswrapper[4764]: I0309 14:34:07.576291 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2efcd9d-6a11-4afd-8903-0f280284cdaa" path="/var/lib/kubelet/pods/f2efcd9d-6a11-4afd-8903-0f280284cdaa/volumes" Mar 09 14:34:12 crc kubenswrapper[4764]: I0309 14:34:12.794400 4764 scope.go:117] "RemoveContainer" containerID="3d8746bde1498da01929d78c37701a3fb1698a0044a00e71ddbcca0d4465fb3c" Mar 09 14:34:17 crc kubenswrapper[4764]: I0309 14:34:17.560763 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2" Mar 09 14:34:17 crc kubenswrapper[4764]: E0309 14:34:17.561860 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:34:22 crc kubenswrapper[4764]: I0309 14:34:22.007282 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_704ddae7-42eb-4609-b4a3-64d5078c2126/util/0.log" Mar 09 14:34:22 crc kubenswrapper[4764]: I0309 14:34:22.307040 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_704ddae7-42eb-4609-b4a3-64d5078c2126/util/0.log" Mar 09 14:34:22 crc kubenswrapper[4764]: I0309 14:34:22.323239 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_704ddae7-42eb-4609-b4a3-64d5078c2126/pull/0.log" Mar 09 14:34:22 crc kubenswrapper[4764]: I0309 14:34:22.369751 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_704ddae7-42eb-4609-b4a3-64d5078c2126/pull/0.log" Mar 09 14:34:22 crc kubenswrapper[4764]: I0309 14:34:22.495513 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_704ddae7-42eb-4609-b4a3-64d5078c2126/util/0.log" Mar 09 14:34:22 crc kubenswrapper[4764]: I0309 14:34:22.529663 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_704ddae7-42eb-4609-b4a3-64d5078c2126/pull/0.log" Mar 09 14:34:22 crc kubenswrapper[4764]: I0309 14:34:22.544817 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_704ddae7-42eb-4609-b4a3-64d5078c2126/extract/0.log" Mar 09 14:34:22 crc kubenswrapper[4764]: I0309 14:34:22.718040 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bbwx5_a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa/extract-utilities/0.log" Mar 09 14:34:22 crc kubenswrapper[4764]: I0309 14:34:22.929680 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bbwx5_a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa/extract-utilities/0.log" Mar 09 14:34:22 crc kubenswrapper[4764]: I0309 14:34:22.959789 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bbwx5_a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa/extract-content/0.log" Mar 09 14:34:22 crc kubenswrapper[4764]: I0309 14:34:22.965424 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bbwx5_a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa/extract-content/0.log" Mar 09 14:34:23 crc kubenswrapper[4764]: I0309 14:34:23.149920 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bbwx5_a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa/extract-content/0.log" Mar 09 14:34:23 crc kubenswrapper[4764]: I0309 14:34:23.150765 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bbwx5_a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa/extract-utilities/0.log" Mar 09 14:34:23 crc kubenswrapper[4764]: I0309 14:34:23.437170 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ggs99_176982f0-3e86-471c-8054-13490ee485bb/extract-utilities/0.log" Mar 09 14:34:23 crc kubenswrapper[4764]: I0309 14:34:23.672680 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-ggs99_176982f0-3e86-471c-8054-13490ee485bb/extract-content/0.log" Mar 09 14:34:23 crc kubenswrapper[4764]: I0309 14:34:23.691895 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ggs99_176982f0-3e86-471c-8054-13490ee485bb/extract-utilities/0.log" Mar 09 14:34:23 crc kubenswrapper[4764]: I0309 14:34:23.731984 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ggs99_176982f0-3e86-471c-8054-13490ee485bb/extract-content/0.log" Mar 09 14:34:23 crc kubenswrapper[4764]: I0309 14:34:23.984036 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ggs99_176982f0-3e86-471c-8054-13490ee485bb/extract-utilities/0.log" Mar 09 14:34:23 crc kubenswrapper[4764]: I0309 14:34:23.991825 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ggs99_176982f0-3e86-471c-8054-13490ee485bb/extract-content/0.log" Mar 09 14:34:24 crc kubenswrapper[4764]: I0309 14:34:24.031295 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bbwx5_a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa/registry-server/0.log" Mar 09 14:34:24 crc kubenswrapper[4764]: I0309 14:34:24.192468 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ggs99_176982f0-3e86-471c-8054-13490ee485bb/registry-server/0.log" Mar 09 14:34:24 crc kubenswrapper[4764]: I0309 14:34:24.216282 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg_413f45cc-5916-45bb-a2a2-7b33029445af/util/0.log" Mar 09 14:34:24 crc kubenswrapper[4764]: I0309 14:34:24.505434 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg_413f45cc-5916-45bb-a2a2-7b33029445af/pull/0.log" Mar 09 14:34:24 crc kubenswrapper[4764]: I0309 14:34:24.515452 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg_413f45cc-5916-45bb-a2a2-7b33029445af/util/0.log" Mar 09 14:34:24 crc kubenswrapper[4764]: I0309 14:34:24.561829 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg_413f45cc-5916-45bb-a2a2-7b33029445af/pull/0.log" Mar 09 14:34:24 crc kubenswrapper[4764]: I0309 14:34:24.716375 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg_413f45cc-5916-45bb-a2a2-7b33029445af/util/0.log" Mar 09 14:34:24 crc kubenswrapper[4764]: I0309 14:34:24.762337 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg_413f45cc-5916-45bb-a2a2-7b33029445af/pull/0.log" Mar 09 14:34:24 crc kubenswrapper[4764]: I0309 14:34:24.762584 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg_413f45cc-5916-45bb-a2a2-7b33029445af/extract/0.log" Mar 09 14:34:24 crc kubenswrapper[4764]: I0309 14:34:24.916448 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-m2gc7_4351c9fc-c207-4d15-b8a6-f51c0651fe83/marketplace-operator/0.log" Mar 09 14:34:25 crc kubenswrapper[4764]: I0309 14:34:25.029715 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-whs64_26dd13d2-9d2e-4f59-97a6-e31b76ccf74c/extract-utilities/0.log" Mar 09 14:34:25 crc kubenswrapper[4764]: 
I0309 14:34:25.198422 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-whs64_26dd13d2-9d2e-4f59-97a6-e31b76ccf74c/extract-utilities/0.log" Mar 09 14:34:25 crc kubenswrapper[4764]: I0309 14:34:25.225337 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-whs64_26dd13d2-9d2e-4f59-97a6-e31b76ccf74c/extract-content/0.log" Mar 09 14:34:25 crc kubenswrapper[4764]: I0309 14:34:25.263220 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-whs64_26dd13d2-9d2e-4f59-97a6-e31b76ccf74c/extract-content/0.log" Mar 09 14:34:25 crc kubenswrapper[4764]: I0309 14:34:25.975386 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-whs64_26dd13d2-9d2e-4f59-97a6-e31b76ccf74c/extract-utilities/0.log" Mar 09 14:34:25 crc kubenswrapper[4764]: I0309 14:34:25.995523 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-whs64_26dd13d2-9d2e-4f59-97a6-e31b76ccf74c/extract-content/0.log" Mar 09 14:34:26 crc kubenswrapper[4764]: I0309 14:34:26.158684 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-whs64_26dd13d2-9d2e-4f59-97a6-e31b76ccf74c/registry-server/0.log" Mar 09 14:34:26 crc kubenswrapper[4764]: I0309 14:34:26.219155 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xn8sz_621cdc4e-d896-4775-b654-2d6606097cb9/extract-utilities/0.log" Mar 09 14:34:26 crc kubenswrapper[4764]: I0309 14:34:26.493037 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xn8sz_621cdc4e-d896-4775-b654-2d6606097cb9/extract-content/0.log" Mar 09 14:34:26 crc kubenswrapper[4764]: I0309 14:34:26.502054 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-xn8sz_621cdc4e-d896-4775-b654-2d6606097cb9/extract-utilities/0.log" Mar 09 14:34:26 crc kubenswrapper[4764]: I0309 14:34:26.508129 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xn8sz_621cdc4e-d896-4775-b654-2d6606097cb9/extract-content/0.log" Mar 09 14:34:26 crc kubenswrapper[4764]: I0309 14:34:26.683133 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xn8sz_621cdc4e-d896-4775-b654-2d6606097cb9/extract-utilities/0.log" Mar 09 14:34:26 crc kubenswrapper[4764]: I0309 14:34:26.711810 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xn8sz_621cdc4e-d896-4775-b654-2d6606097cb9/extract-content/0.log" Mar 09 14:34:27 crc kubenswrapper[4764]: I0309 14:34:27.519059 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xn8sz_621cdc4e-d896-4775-b654-2d6606097cb9/registry-server/0.log" Mar 09 14:34:28 crc kubenswrapper[4764]: I0309 14:34:28.560239 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2" Mar 09 14:34:28 crc kubenswrapper[4764]: E0309 14:34:28.560679 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:34:42 crc kubenswrapper[4764]: I0309 14:34:42.560015 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2" Mar 09 14:34:42 crc kubenswrapper[4764]: E0309 14:34:42.561312 4764 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:34:57 crc kubenswrapper[4764]: I0309 14:34:57.560198 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2" Mar 09 14:34:57 crc kubenswrapper[4764]: E0309 14:34:57.561149 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:35:09 crc kubenswrapper[4764]: I0309 14:35:09.560561 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2" Mar 09 14:35:09 crc kubenswrapper[4764]: E0309 14:35:09.561830 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:35:21 crc kubenswrapper[4764]: I0309 14:35:21.561026 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2" Mar 09 14:35:21 crc kubenswrapper[4764]: E0309 14:35:21.562289 4764 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:35:34 crc kubenswrapper[4764]: I0309 14:35:34.560361 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2" Mar 09 14:35:34 crc kubenswrapper[4764]: E0309 14:35:34.561586 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:35:45 crc kubenswrapper[4764]: I0309 14:35:45.569695 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2" Mar 09 14:35:45 crc kubenswrapper[4764]: E0309 14:35:45.570801 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:35:57 crc kubenswrapper[4764]: I0309 14:35:57.560950 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2" Mar 09 14:35:57 crc kubenswrapper[4764]: E0309 
14:35:57.562031 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:36:00 crc kubenswrapper[4764]: I0309 14:36:00.157308 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551116-plhlc"] Mar 09 14:36:00 crc kubenswrapper[4764]: E0309 14:36:00.158780 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b063d56-a0eb-4b2d-8b53-2c63feead99e" containerName="oc" Mar 09 14:36:00 crc kubenswrapper[4764]: I0309 14:36:00.158803 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b063d56-a0eb-4b2d-8b53-2c63feead99e" containerName="oc" Mar 09 14:36:00 crc kubenswrapper[4764]: E0309 14:36:00.158828 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2efcd9d-6a11-4afd-8903-0f280284cdaa" containerName="extract-utilities" Mar 09 14:36:00 crc kubenswrapper[4764]: I0309 14:36:00.158840 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2efcd9d-6a11-4afd-8903-0f280284cdaa" containerName="extract-utilities" Mar 09 14:36:00 crc kubenswrapper[4764]: E0309 14:36:00.158858 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2efcd9d-6a11-4afd-8903-0f280284cdaa" containerName="registry-server" Mar 09 14:36:00 crc kubenswrapper[4764]: I0309 14:36:00.158866 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2efcd9d-6a11-4afd-8903-0f280284cdaa" containerName="registry-server" Mar 09 14:36:00 crc kubenswrapper[4764]: E0309 14:36:00.158884 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2efcd9d-6a11-4afd-8903-0f280284cdaa" containerName="extract-content" Mar 09 
14:36:00 crc kubenswrapper[4764]: I0309 14:36:00.158894 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2efcd9d-6a11-4afd-8903-0f280284cdaa" containerName="extract-content" Mar 09 14:36:00 crc kubenswrapper[4764]: I0309 14:36:00.159251 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2efcd9d-6a11-4afd-8903-0f280284cdaa" containerName="registry-server" Mar 09 14:36:00 crc kubenswrapper[4764]: I0309 14:36:00.159276 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b063d56-a0eb-4b2d-8b53-2c63feead99e" containerName="oc" Mar 09 14:36:00 crc kubenswrapper[4764]: I0309 14:36:00.160391 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551116-plhlc" Mar 09 14:36:00 crc kubenswrapper[4764]: I0309 14:36:00.163021 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 14:36:00 crc kubenswrapper[4764]: I0309 14:36:00.163501 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:36:00 crc kubenswrapper[4764]: I0309 14:36:00.163612 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:36:00 crc kubenswrapper[4764]: I0309 14:36:00.174899 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551116-plhlc"] Mar 09 14:36:00 crc kubenswrapper[4764]: I0309 14:36:00.256165 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88bdr\" (UniqueName: \"kubernetes.io/projected/25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6-kube-api-access-88bdr\") pod \"auto-csr-approver-29551116-plhlc\" (UID: \"25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6\") " pod="openshift-infra/auto-csr-approver-29551116-plhlc" Mar 09 14:36:00 crc kubenswrapper[4764]: I0309 14:36:00.359017 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88bdr\" (UniqueName: \"kubernetes.io/projected/25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6-kube-api-access-88bdr\") pod \"auto-csr-approver-29551116-plhlc\" (UID: \"25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6\") " pod="openshift-infra/auto-csr-approver-29551116-plhlc" Mar 09 14:36:00 crc kubenswrapper[4764]: I0309 14:36:00.388933 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88bdr\" (UniqueName: \"kubernetes.io/projected/25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6-kube-api-access-88bdr\") pod \"auto-csr-approver-29551116-plhlc\" (UID: \"25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6\") " pod="openshift-infra/auto-csr-approver-29551116-plhlc" Mar 09 14:36:00 crc kubenswrapper[4764]: I0309 14:36:00.487057 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551116-plhlc" Mar 09 14:36:01 crc kubenswrapper[4764]: I0309 14:36:01.005047 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551116-plhlc"] Mar 09 14:36:01 crc kubenswrapper[4764]: I0309 14:36:01.028603 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 14:36:01 crc kubenswrapper[4764]: I0309 14:36:01.199051 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551116-plhlc" event={"ID":"25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6","Type":"ContainerStarted","Data":"0839267fab67af7136fa26619fbb2672ecbc2c656e8177e7e69640966981f45a"} Mar 09 14:36:03 crc kubenswrapper[4764]: I0309 14:36:03.227918 4764 generic.go:334] "Generic (PLEG): container finished" podID="25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6" containerID="ea7fd0d4a7f1c688c8ce67675688ffbac7999aee93247d1046a8ee856d2e349c" exitCode=0 Mar 09 14:36:03 crc kubenswrapper[4764]: I0309 14:36:03.227993 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29551116-plhlc" event={"ID":"25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6","Type":"ContainerDied","Data":"ea7fd0d4a7f1c688c8ce67675688ffbac7999aee93247d1046a8ee856d2e349c"} Mar 09 14:36:04 crc kubenswrapper[4764]: I0309 14:36:04.602127 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551116-plhlc" Mar 09 14:36:04 crc kubenswrapper[4764]: I0309 14:36:04.677552 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88bdr\" (UniqueName: \"kubernetes.io/projected/25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6-kube-api-access-88bdr\") pod \"25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6\" (UID: \"25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6\") " Mar 09 14:36:05 crc kubenswrapper[4764]: I0309 14:36:05.193005 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6-kube-api-access-88bdr" (OuterVolumeSpecName: "kube-api-access-88bdr") pod "25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6" (UID: "25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6"). InnerVolumeSpecName "kube-api-access-88bdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:36:05 crc kubenswrapper[4764]: I0309 14:36:05.249599 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551116-plhlc" Mar 09 14:36:05 crc kubenswrapper[4764]: I0309 14:36:05.249510 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551116-plhlc" event={"ID":"25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6","Type":"ContainerDied","Data":"0839267fab67af7136fa26619fbb2672ecbc2c656e8177e7e69640966981f45a"} Mar 09 14:36:05 crc kubenswrapper[4764]: I0309 14:36:05.261160 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0839267fab67af7136fa26619fbb2672ecbc2c656e8177e7e69640966981f45a" Mar 09 14:36:05 crc kubenswrapper[4764]: I0309 14:36:05.292108 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88bdr\" (UniqueName: \"kubernetes.io/projected/25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6-kube-api-access-88bdr\") on node \"crc\" DevicePath \"\"" Mar 09 14:36:05 crc kubenswrapper[4764]: I0309 14:36:05.685599 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551110-sjw8t"] Mar 09 14:36:05 crc kubenswrapper[4764]: I0309 14:36:05.694409 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551110-sjw8t"] Mar 09 14:36:07 crc kubenswrapper[4764]: I0309 14:36:07.571277 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44063b01-0b96-488c-98af-43cdb752467e" path="/var/lib/kubelet/pods/44063b01-0b96-488c-98af-43cdb752467e/volumes" Mar 09 14:36:12 crc kubenswrapper[4764]: I0309 14:36:12.560472 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2" Mar 09 14:36:12 crc kubenswrapper[4764]: E0309 14:36:12.561708 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:36:12 crc kubenswrapper[4764]: I0309 14:36:12.935194 4764 scope.go:117] "RemoveContainer" containerID="b7f1a91e8b51914a12830dcf15f240318b51a9141210e70d3ec57af5cc977455" Mar 09 14:36:24 crc kubenswrapper[4764]: I0309 14:36:24.563295 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2" Mar 09 14:36:24 crc kubenswrapper[4764]: E0309 14:36:24.564365 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:36:39 crc kubenswrapper[4764]: I0309 14:36:39.560835 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2" Mar 09 14:36:39 crc kubenswrapper[4764]: E0309 14:36:39.562172 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:36:42 crc kubenswrapper[4764]: I0309 14:36:42.649533 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4069cd4-c4ea-4c35-a8e3-231f40655d27" containerID="3751c9a7f60cedf52e5109a5d05ced7856abcae8f6d88733462034a07970747a" exitCode=0 
Mar 09 14:36:42 crc kubenswrapper[4764]: I0309 14:36:42.649721 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2n85d/must-gather-hmzh8" event={"ID":"f4069cd4-c4ea-4c35-a8e3-231f40655d27","Type":"ContainerDied","Data":"3751c9a7f60cedf52e5109a5d05ced7856abcae8f6d88733462034a07970747a"} Mar 09 14:36:42 crc kubenswrapper[4764]: I0309 14:36:42.651310 4764 scope.go:117] "RemoveContainer" containerID="3751c9a7f60cedf52e5109a5d05ced7856abcae8f6d88733462034a07970747a" Mar 09 14:36:42 crc kubenswrapper[4764]: I0309 14:36:42.798005 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2n85d_must-gather-hmzh8_f4069cd4-c4ea-4c35-a8e3-231f40655d27/gather/0.log" Mar 09 14:36:51 crc kubenswrapper[4764]: I0309 14:36:51.560346 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2" Mar 09 14:36:51 crc kubenswrapper[4764]: E0309 14:36:51.561592 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:36:51 crc kubenswrapper[4764]: I0309 14:36:51.589063 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2n85d/must-gather-hmzh8"] Mar 09 14:36:51 crc kubenswrapper[4764]: I0309 14:36:51.589402 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-2n85d/must-gather-hmzh8" podUID="f4069cd4-c4ea-4c35-a8e3-231f40655d27" containerName="copy" containerID="cri-o://7a1d05f1a79149bd8bced16f5f4fb67221a689028074b5bf7fc3e37c2411e2d9" gracePeriod=2 Mar 09 14:36:51 crc kubenswrapper[4764]: I0309 14:36:51.605606 4764 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2n85d/must-gather-hmzh8"]
Mar 09 14:36:51 crc kubenswrapper[4764]: I0309 14:36:51.746887 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2n85d_must-gather-hmzh8_f4069cd4-c4ea-4c35-a8e3-231f40655d27/copy/0.log"
Mar 09 14:36:51 crc kubenswrapper[4764]: I0309 14:36:51.747766 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4069cd4-c4ea-4c35-a8e3-231f40655d27" containerID="7a1d05f1a79149bd8bced16f5f4fb67221a689028074b5bf7fc3e37c2411e2d9" exitCode=143
Mar 09 14:36:52 crc kubenswrapper[4764]: I0309 14:36:52.049522 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2n85d_must-gather-hmzh8_f4069cd4-c4ea-4c35-a8e3-231f40655d27/copy/0.log"
Mar 09 14:36:52 crc kubenswrapper[4764]: I0309 14:36:52.049980 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2n85d/must-gather-hmzh8"
Mar 09 14:36:52 crc kubenswrapper[4764]: I0309 14:36:52.138124 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f4069cd4-c4ea-4c35-a8e3-231f40655d27-must-gather-output\") pod \"f4069cd4-c4ea-4c35-a8e3-231f40655d27\" (UID: \"f4069cd4-c4ea-4c35-a8e3-231f40655d27\") "
Mar 09 14:36:52 crc kubenswrapper[4764]: I0309 14:36:52.138544 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bslj\" (UniqueName: \"kubernetes.io/projected/f4069cd4-c4ea-4c35-a8e3-231f40655d27-kube-api-access-8bslj\") pod \"f4069cd4-c4ea-4c35-a8e3-231f40655d27\" (UID: \"f4069cd4-c4ea-4c35-a8e3-231f40655d27\") "
Mar 09 14:36:52 crc kubenswrapper[4764]: I0309 14:36:52.191657 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4069cd4-c4ea-4c35-a8e3-231f40655d27-kube-api-access-8bslj" (OuterVolumeSpecName: "kube-api-access-8bslj") pod "f4069cd4-c4ea-4c35-a8e3-231f40655d27" (UID: "f4069cd4-c4ea-4c35-a8e3-231f40655d27"). InnerVolumeSpecName "kube-api-access-8bslj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:36:52 crc kubenswrapper[4764]: I0309 14:36:52.242532 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bslj\" (UniqueName: \"kubernetes.io/projected/f4069cd4-c4ea-4c35-a8e3-231f40655d27-kube-api-access-8bslj\") on node \"crc\" DevicePath \"\""
Mar 09 14:36:52 crc kubenswrapper[4764]: I0309 14:36:52.381446 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4069cd4-c4ea-4c35-a8e3-231f40655d27-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "f4069cd4-c4ea-4c35-a8e3-231f40655d27" (UID: "f4069cd4-c4ea-4c35-a8e3-231f40655d27"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:36:52 crc kubenswrapper[4764]: I0309 14:36:52.447007 4764 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f4069cd4-c4ea-4c35-a8e3-231f40655d27-must-gather-output\") on node \"crc\" DevicePath \"\""
Mar 09 14:36:52 crc kubenswrapper[4764]: I0309 14:36:52.759014 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2n85d_must-gather-hmzh8_f4069cd4-c4ea-4c35-a8e3-231f40655d27/copy/0.log"
Mar 09 14:36:52 crc kubenswrapper[4764]: I0309 14:36:52.759425 4764 scope.go:117] "RemoveContainer" containerID="7a1d05f1a79149bd8bced16f5f4fb67221a689028074b5bf7fc3e37c2411e2d9"
Mar 09 14:36:52 crc kubenswrapper[4764]: I0309 14:36:52.759461 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2n85d/must-gather-hmzh8"
Mar 09 14:36:52 crc kubenswrapper[4764]: I0309 14:36:52.788005 4764 scope.go:117] "RemoveContainer" containerID="3751c9a7f60cedf52e5109a5d05ced7856abcae8f6d88733462034a07970747a"
Mar 09 14:36:53 crc kubenswrapper[4764]: I0309 14:36:53.572883 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4069cd4-c4ea-4c35-a8e3-231f40655d27" path="/var/lib/kubelet/pods/f4069cd4-c4ea-4c35-a8e3-231f40655d27/volumes"
Mar 09 14:37:03 crc kubenswrapper[4764]: I0309 14:37:03.560772 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"
Mar 09 14:37:03 crc kubenswrapper[4764]: E0309 14:37:03.564012 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:37:13 crc kubenswrapper[4764]: I0309 14:37:13.009509 4764 scope.go:117] "RemoveContainer" containerID="35306a339ac014f00b9490fd62d260c20edfb15b36c977f1b4e50e10596530a4"
Mar 09 14:37:18 crc kubenswrapper[4764]: I0309 14:37:18.560670 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"
Mar 09 14:37:18 crc kubenswrapper[4764]: E0309 14:37:18.562994 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:37:33 crc kubenswrapper[4764]: I0309 14:37:33.560855 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"
Mar 09 14:37:33 crc kubenswrapper[4764]: E0309 14:37:33.561925 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:37:45 crc kubenswrapper[4764]: I0309 14:37:45.567181 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"
Mar 09 14:37:45 crc kubenswrapper[4764]: E0309 14:37:45.568313 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:37:58 crc kubenswrapper[4764]: I0309 14:37:58.560193 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"
Mar 09 14:37:58 crc kubenswrapper[4764]: E0309 14:37:58.561279 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:38:00 crc kubenswrapper[4764]: I0309 14:38:00.154320 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551118-kjz4g"]
Mar 09 14:38:00 crc kubenswrapper[4764]: E0309 14:38:00.155778 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4069cd4-c4ea-4c35-a8e3-231f40655d27" containerName="copy"
Mar 09 14:38:00 crc kubenswrapper[4764]: I0309 14:38:00.155797 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4069cd4-c4ea-4c35-a8e3-231f40655d27" containerName="copy"
Mar 09 14:38:00 crc kubenswrapper[4764]: E0309 14:38:00.155815 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6" containerName="oc"
Mar 09 14:38:00 crc kubenswrapper[4764]: I0309 14:38:00.155824 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6" containerName="oc"
Mar 09 14:38:00 crc kubenswrapper[4764]: E0309 14:38:00.155853 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4069cd4-c4ea-4c35-a8e3-231f40655d27" containerName="gather"
Mar 09 14:38:00 crc kubenswrapper[4764]: I0309 14:38:00.156001 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4069cd4-c4ea-4c35-a8e3-231f40655d27" containerName="gather"
Mar 09 14:38:00 crc kubenswrapper[4764]: I0309 14:38:00.156308 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4069cd4-c4ea-4c35-a8e3-231f40655d27" containerName="copy"
Mar 09 14:38:00 crc kubenswrapper[4764]: I0309 14:38:00.156340 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4069cd4-c4ea-4c35-a8e3-231f40655d27" containerName="gather"
Mar 09 14:38:00 crc kubenswrapper[4764]: I0309 14:38:00.156363 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6" containerName="oc"
Mar 09 14:38:00 crc kubenswrapper[4764]: I0309 14:38:00.157396 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551118-kjz4g"
Mar 09 14:38:00 crc kubenswrapper[4764]: I0309 14:38:00.160255 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr"
Mar 09 14:38:00 crc kubenswrapper[4764]: I0309 14:38:00.160262 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 14:38:00 crc kubenswrapper[4764]: I0309 14:38:00.161685 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 14:38:00 crc kubenswrapper[4764]: I0309 14:38:00.164076 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551118-kjz4g"]
Mar 09 14:38:00 crc kubenswrapper[4764]: I0309 14:38:00.331539 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pjvr\" (UniqueName: \"kubernetes.io/projected/f04448f5-694b-46ae-9599-546c7bbe0c14-kube-api-access-5pjvr\") pod \"auto-csr-approver-29551118-kjz4g\" (UID: \"f04448f5-694b-46ae-9599-546c7bbe0c14\") " pod="openshift-infra/auto-csr-approver-29551118-kjz4g"
Mar 09 14:38:00 crc kubenswrapper[4764]: I0309 14:38:00.433343 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pjvr\" (UniqueName: \"kubernetes.io/projected/f04448f5-694b-46ae-9599-546c7bbe0c14-kube-api-access-5pjvr\") pod \"auto-csr-approver-29551118-kjz4g\" (UID: \"f04448f5-694b-46ae-9599-546c7bbe0c14\") " pod="openshift-infra/auto-csr-approver-29551118-kjz4g"
Mar 09 14:38:00 crc kubenswrapper[4764]: I0309 14:38:00.885711 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pjvr\" (UniqueName: \"kubernetes.io/projected/f04448f5-694b-46ae-9599-546c7bbe0c14-kube-api-access-5pjvr\") pod \"auto-csr-approver-29551118-kjz4g\" (UID: \"f04448f5-694b-46ae-9599-546c7bbe0c14\") " pod="openshift-infra/auto-csr-approver-29551118-kjz4g"
Mar 09 14:38:01 crc kubenswrapper[4764]: I0309 14:38:01.078703 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551118-kjz4g"
Mar 09 14:38:01 crc kubenswrapper[4764]: I0309 14:38:01.575667 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551118-kjz4g"]
Mar 09 14:38:02 crc kubenswrapper[4764]: I0309 14:38:02.571554 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551118-kjz4g" event={"ID":"f04448f5-694b-46ae-9599-546c7bbe0c14","Type":"ContainerStarted","Data":"0ed8f0ec55a294187ea8be7f43ebadf8b14bdebd77fa0465ade1a0a2c4234e1a"}
Mar 09 14:38:03 crc kubenswrapper[4764]: I0309 14:38:03.584701 4764 generic.go:334] "Generic (PLEG): container finished" podID="f04448f5-694b-46ae-9599-546c7bbe0c14" containerID="71c2dc76a892ba46131f8abbf7343dc29824edff40c7100aa8bae5d80219ca9b" exitCode=0
Mar 09 14:38:03 crc kubenswrapper[4764]: I0309 14:38:03.584815 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551118-kjz4g" event={"ID":"f04448f5-694b-46ae-9599-546c7bbe0c14","Type":"ContainerDied","Data":"71c2dc76a892ba46131f8abbf7343dc29824edff40c7100aa8bae5d80219ca9b"}
Mar 09 14:38:04 crc kubenswrapper[4764]: I0309 14:38:04.998012 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551118-kjz4g"
Mar 09 14:38:05 crc kubenswrapper[4764]: I0309 14:38:05.151512 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pjvr\" (UniqueName: \"kubernetes.io/projected/f04448f5-694b-46ae-9599-546c7bbe0c14-kube-api-access-5pjvr\") pod \"f04448f5-694b-46ae-9599-546c7bbe0c14\" (UID: \"f04448f5-694b-46ae-9599-546c7bbe0c14\") "
Mar 09 14:38:05 crc kubenswrapper[4764]: I0309 14:38:05.161046 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f04448f5-694b-46ae-9599-546c7bbe0c14-kube-api-access-5pjvr" (OuterVolumeSpecName: "kube-api-access-5pjvr") pod "f04448f5-694b-46ae-9599-546c7bbe0c14" (UID: "f04448f5-694b-46ae-9599-546c7bbe0c14"). InnerVolumeSpecName "kube-api-access-5pjvr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:38:05 crc kubenswrapper[4764]: I0309 14:38:05.255157 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pjvr\" (UniqueName: \"kubernetes.io/projected/f04448f5-694b-46ae-9599-546c7bbe0c14-kube-api-access-5pjvr\") on node \"crc\" DevicePath \"\""
Mar 09 14:38:05 crc kubenswrapper[4764]: I0309 14:38:05.613432 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551118-kjz4g" event={"ID":"f04448f5-694b-46ae-9599-546c7bbe0c14","Type":"ContainerDied","Data":"0ed8f0ec55a294187ea8be7f43ebadf8b14bdebd77fa0465ade1a0a2c4234e1a"}
Mar 09 14:38:05 crc kubenswrapper[4764]: I0309 14:38:05.614113 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ed8f0ec55a294187ea8be7f43ebadf8b14bdebd77fa0465ade1a0a2c4234e1a"
Mar 09 14:38:05 crc kubenswrapper[4764]: I0309 14:38:05.613799 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551118-kjz4g"
Mar 09 14:38:06 crc kubenswrapper[4764]: I0309 14:38:06.075067 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551112-lcwnw"]
Mar 09 14:38:06 crc kubenswrapper[4764]: I0309 14:38:06.084590 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551112-lcwnw"]
Mar 09 14:38:07 crc kubenswrapper[4764]: I0309 14:38:07.574412 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3078e21d-b42c-45f0-94c0-d3980ec27f1f" path="/var/lib/kubelet/pods/3078e21d-b42c-45f0-94c0-d3980ec27f1f/volumes"
Mar 09 14:38:12 crc kubenswrapper[4764]: I0309 14:38:12.560132 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"
Mar 09 14:38:12 crc kubenswrapper[4764]: E0309 14:38:12.561372 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:38:13 crc kubenswrapper[4764]: I0309 14:38:13.142054 4764 scope.go:117] "RemoveContainer" containerID="6874ccb9508c9920d5940aa912a69fc71adf07efd3d1ffdbd80a9373286c8c70"
Mar 09 14:38:25 crc kubenswrapper[4764]: I0309 14:38:25.567108 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"
Mar 09 14:38:25 crc kubenswrapper[4764]: E0309 14:38:25.568486 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:38:38 crc kubenswrapper[4764]: I0309 14:38:38.559669 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"
Mar 09 14:38:38 crc kubenswrapper[4764]: I0309 14:38:38.953859 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"ad419096bfe6f5c82a80213c43472ce393de4b635e5b73b0e84a78e63ff05edb"}
Mar 09 14:40:00 crc kubenswrapper[4764]: I0309 14:40:00.163567 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551120-m5thp"]
Mar 09 14:40:00 crc kubenswrapper[4764]: E0309 14:40:00.165105 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f04448f5-694b-46ae-9599-546c7bbe0c14" containerName="oc"
Mar 09 14:40:00 crc kubenswrapper[4764]: I0309 14:40:00.165125 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f04448f5-694b-46ae-9599-546c7bbe0c14" containerName="oc"
Mar 09 14:40:00 crc kubenswrapper[4764]: I0309 14:40:00.165490 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f04448f5-694b-46ae-9599-546c7bbe0c14" containerName="oc"
Mar 09 14:40:00 crc kubenswrapper[4764]: I0309 14:40:00.166516 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551120-m5thp"
Mar 09 14:40:00 crc kubenswrapper[4764]: I0309 14:40:00.170019 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr"
Mar 09 14:40:00 crc kubenswrapper[4764]: I0309 14:40:00.170095 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 14:40:00 crc kubenswrapper[4764]: I0309 14:40:00.170394 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 14:40:00 crc kubenswrapper[4764]: I0309 14:40:00.189390 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551120-m5thp"]
Mar 09 14:40:00 crc kubenswrapper[4764]: I0309 14:40:00.269497 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh8gd\" (UniqueName: \"kubernetes.io/projected/0816f954-d7d8-485c-80c2-f37396ccc846-kube-api-access-xh8gd\") pod \"auto-csr-approver-29551120-m5thp\" (UID: \"0816f954-d7d8-485c-80c2-f37396ccc846\") " pod="openshift-infra/auto-csr-approver-29551120-m5thp"
Mar 09 14:40:00 crc kubenswrapper[4764]: I0309 14:40:00.371719 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh8gd\" (UniqueName: \"kubernetes.io/projected/0816f954-d7d8-485c-80c2-f37396ccc846-kube-api-access-xh8gd\") pod \"auto-csr-approver-29551120-m5thp\" (UID: \"0816f954-d7d8-485c-80c2-f37396ccc846\") " pod="openshift-infra/auto-csr-approver-29551120-m5thp"
Mar 09 14:40:00 crc kubenswrapper[4764]: I0309 14:40:00.393523 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh8gd\" (UniqueName: \"kubernetes.io/projected/0816f954-d7d8-485c-80c2-f37396ccc846-kube-api-access-xh8gd\") pod \"auto-csr-approver-29551120-m5thp\" (UID: \"0816f954-d7d8-485c-80c2-f37396ccc846\") " pod="openshift-infra/auto-csr-approver-29551120-m5thp"
Mar 09 14:40:00 crc kubenswrapper[4764]: I0309 14:40:00.491735 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551120-m5thp"
Mar 09 14:40:01 crc kubenswrapper[4764]: I0309 14:40:01.006936 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551120-m5thp"]
Mar 09 14:40:01 crc kubenswrapper[4764]: I0309 14:40:01.829778 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551120-m5thp" event={"ID":"0816f954-d7d8-485c-80c2-f37396ccc846","Type":"ContainerStarted","Data":"418d95bf9922ee6c68fc6ee54bf92d132a0fa80cbe02cb056aa7e9af26613f17"}
Mar 09 14:40:02 crc kubenswrapper[4764]: I0309 14:40:02.454368 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fwqd6"]
Mar 09 14:40:02 crc kubenswrapper[4764]: I0309 14:40:02.457178 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fwqd6"
Mar 09 14:40:02 crc kubenswrapper[4764]: I0309 14:40:02.465447 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fwqd6"]
Mar 09 14:40:02 crc kubenswrapper[4764]: I0309 14:40:02.531152 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9f449aa-58d0-4541-b02d-f7240113d330-utilities\") pod \"certified-operators-fwqd6\" (UID: \"a9f449aa-58d0-4541-b02d-f7240113d330\") " pod="openshift-marketplace/certified-operators-fwqd6"
Mar 09 14:40:02 crc kubenswrapper[4764]: I0309 14:40:02.531528 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9f449aa-58d0-4541-b02d-f7240113d330-catalog-content\") pod \"certified-operators-fwqd6\" (UID: \"a9f449aa-58d0-4541-b02d-f7240113d330\") " pod="openshift-marketplace/certified-operators-fwqd6"
Mar 09 14:40:02 crc kubenswrapper[4764]: I0309 14:40:02.531750 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6dqh\" (UniqueName: \"kubernetes.io/projected/a9f449aa-58d0-4541-b02d-f7240113d330-kube-api-access-g6dqh\") pod \"certified-operators-fwqd6\" (UID: \"a9f449aa-58d0-4541-b02d-f7240113d330\") " pod="openshift-marketplace/certified-operators-fwqd6"
Mar 09 14:40:02 crc kubenswrapper[4764]: I0309 14:40:02.634052 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9f449aa-58d0-4541-b02d-f7240113d330-utilities\") pod \"certified-operators-fwqd6\" (UID: \"a9f449aa-58d0-4541-b02d-f7240113d330\") " pod="openshift-marketplace/certified-operators-fwqd6"
Mar 09 14:40:02 crc kubenswrapper[4764]: I0309 14:40:02.634120 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9f449aa-58d0-4541-b02d-f7240113d330-catalog-content\") pod \"certified-operators-fwqd6\" (UID: \"a9f449aa-58d0-4541-b02d-f7240113d330\") " pod="openshift-marketplace/certified-operators-fwqd6"
Mar 09 14:40:02 crc kubenswrapper[4764]: I0309 14:40:02.634153 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6dqh\" (UniqueName: \"kubernetes.io/projected/a9f449aa-58d0-4541-b02d-f7240113d330-kube-api-access-g6dqh\") pod \"certified-operators-fwqd6\" (UID: \"a9f449aa-58d0-4541-b02d-f7240113d330\") " pod="openshift-marketplace/certified-operators-fwqd6"
Mar 09 14:40:02 crc kubenswrapper[4764]: I0309 14:40:02.635508 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9f449aa-58d0-4541-b02d-f7240113d330-catalog-content\") pod \"certified-operators-fwqd6\" (UID: \"a9f449aa-58d0-4541-b02d-f7240113d330\") " pod="openshift-marketplace/certified-operators-fwqd6"
Mar 09 14:40:02 crc kubenswrapper[4764]: I0309 14:40:02.636097 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9f449aa-58d0-4541-b02d-f7240113d330-utilities\") pod \"certified-operators-fwqd6\" (UID: \"a9f449aa-58d0-4541-b02d-f7240113d330\") " pod="openshift-marketplace/certified-operators-fwqd6"
Mar 09 14:40:02 crc kubenswrapper[4764]: I0309 14:40:02.665495 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6dqh\" (UniqueName: \"kubernetes.io/projected/a9f449aa-58d0-4541-b02d-f7240113d330-kube-api-access-g6dqh\") pod \"certified-operators-fwqd6\" (UID: \"a9f449aa-58d0-4541-b02d-f7240113d330\") " pod="openshift-marketplace/certified-operators-fwqd6"
Mar 09 14:40:02 crc kubenswrapper[4764]: I0309 14:40:02.817168 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fwqd6"
Mar 09 14:40:02 crc kubenswrapper[4764]: I0309 14:40:02.848869 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551120-m5thp" event={"ID":"0816f954-d7d8-485c-80c2-f37396ccc846","Type":"ContainerStarted","Data":"47825136b03ce00e9b8dbc1f9567349d8325a39be5d441fbdcc3a705bf988cac"}
Mar 09 14:40:03 crc kubenswrapper[4764]: I0309 14:40:03.165415 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551120-m5thp" podStartSLOduration=1.892993332 podStartE2EDuration="3.165391616s" podCreationTimestamp="2026-03-09 14:40:00 +0000 UTC" firstStartedPulling="2026-03-09 14:40:01.014022091 +0000 UTC m=+4756.264194159" lastFinishedPulling="2026-03-09 14:40:02.286420535 +0000 UTC m=+4757.536592443" observedRunningTime="2026-03-09 14:40:02.874172296 +0000 UTC m=+4758.124344204" watchObservedRunningTime="2026-03-09 14:40:03.165391616 +0000 UTC m=+4758.415563544"
Mar 09 14:40:03 crc kubenswrapper[4764]: I0309 14:40:03.173788 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fwqd6"]
Mar 09 14:40:03 crc kubenswrapper[4764]: W0309 14:40:03.191454 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9f449aa_58d0_4541_b02d_f7240113d330.slice/crio-c486688c64fb082ba7d8532410236dfd36e9116f3c410bc3d5d352e3a5c715f8 WatchSource:0}: Error finding container c486688c64fb082ba7d8532410236dfd36e9116f3c410bc3d5d352e3a5c715f8: Status 404 returned error can't find the container with id c486688c64fb082ba7d8532410236dfd36e9116f3c410bc3d5d352e3a5c715f8
Mar 09 14:40:03 crc kubenswrapper[4764]: I0309 14:40:03.862198 4764 generic.go:334] "Generic (PLEG): container finished" podID="0816f954-d7d8-485c-80c2-f37396ccc846" containerID="47825136b03ce00e9b8dbc1f9567349d8325a39be5d441fbdcc3a705bf988cac" exitCode=0
Mar 09 14:40:03 crc kubenswrapper[4764]: I0309 14:40:03.862325 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551120-m5thp" event={"ID":"0816f954-d7d8-485c-80c2-f37396ccc846","Type":"ContainerDied","Data":"47825136b03ce00e9b8dbc1f9567349d8325a39be5d441fbdcc3a705bf988cac"}
Mar 09 14:40:03 crc kubenswrapper[4764]: I0309 14:40:03.866765 4764 generic.go:334] "Generic (PLEG): container finished" podID="a9f449aa-58d0-4541-b02d-f7240113d330" containerID="25d756717148db03276e3bb732fb1351545554969bae0b614cc5f1920c670594" exitCode=0
Mar 09 14:40:03 crc kubenswrapper[4764]: I0309 14:40:03.866801 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwqd6" event={"ID":"a9f449aa-58d0-4541-b02d-f7240113d330","Type":"ContainerDied","Data":"25d756717148db03276e3bb732fb1351545554969bae0b614cc5f1920c670594"}
Mar 09 14:40:03 crc kubenswrapper[4764]: I0309 14:40:03.866825 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwqd6" event={"ID":"a9f449aa-58d0-4541-b02d-f7240113d330","Type":"ContainerStarted","Data":"c486688c64fb082ba7d8532410236dfd36e9116f3c410bc3d5d352e3a5c715f8"}
Mar 09 14:40:05 crc kubenswrapper[4764]: I0309 14:40:05.328764 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551120-m5thp"
Mar 09 14:40:05 crc kubenswrapper[4764]: I0309 14:40:05.421948 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xh8gd\" (UniqueName: \"kubernetes.io/projected/0816f954-d7d8-485c-80c2-f37396ccc846-kube-api-access-xh8gd\") pod \"0816f954-d7d8-485c-80c2-f37396ccc846\" (UID: \"0816f954-d7d8-485c-80c2-f37396ccc846\") "
Mar 09 14:40:05 crc kubenswrapper[4764]: I0309 14:40:05.430105 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0816f954-d7d8-485c-80c2-f37396ccc846-kube-api-access-xh8gd" (OuterVolumeSpecName: "kube-api-access-xh8gd") pod "0816f954-d7d8-485c-80c2-f37396ccc846" (UID: "0816f954-d7d8-485c-80c2-f37396ccc846"). InnerVolumeSpecName "kube-api-access-xh8gd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:40:05 crc kubenswrapper[4764]: I0309 14:40:05.524818 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xh8gd\" (UniqueName: \"kubernetes.io/projected/0816f954-d7d8-485c-80c2-f37396ccc846-kube-api-access-xh8gd\") on node \"crc\" DevicePath \"\""
Mar 09 14:40:05 crc kubenswrapper[4764]: I0309 14:40:05.894306 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwqd6" event={"ID":"a9f449aa-58d0-4541-b02d-f7240113d330","Type":"ContainerStarted","Data":"72f615c2613239454787d89a4736ccba1f726574b352ae01d064ac1a12aa73a8"}
Mar 09 14:40:05 crc kubenswrapper[4764]: I0309 14:40:05.897725 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551120-m5thp" event={"ID":"0816f954-d7d8-485c-80c2-f37396ccc846","Type":"ContainerDied","Data":"418d95bf9922ee6c68fc6ee54bf92d132a0fa80cbe02cb056aa7e9af26613f17"}
Mar 09 14:40:05 crc kubenswrapper[4764]: I0309 14:40:05.897798 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="418d95bf9922ee6c68fc6ee54bf92d132a0fa80cbe02cb056aa7e9af26613f17"
Mar 09 14:40:05 crc kubenswrapper[4764]: I0309 14:40:05.897847 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551120-m5thp"
Mar 09 14:40:05 crc kubenswrapper[4764]: I0309 14:40:05.983088 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551114-9w2wl"]
Mar 09 14:40:05 crc kubenswrapper[4764]: I0309 14:40:05.996293 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551114-9w2wl"]
Mar 09 14:40:06 crc kubenswrapper[4764]: I0309 14:40:06.911196 4764 generic.go:334] "Generic (PLEG): container finished" podID="a9f449aa-58d0-4541-b02d-f7240113d330" containerID="72f615c2613239454787d89a4736ccba1f726574b352ae01d064ac1a12aa73a8" exitCode=0
Mar 09 14:40:06 crc kubenswrapper[4764]: I0309 14:40:06.911256 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwqd6" event={"ID":"a9f449aa-58d0-4541-b02d-f7240113d330","Type":"ContainerDied","Data":"72f615c2613239454787d89a4736ccba1f726574b352ae01d064ac1a12aa73a8"}
Mar 09 14:40:07 crc kubenswrapper[4764]: I0309 14:40:07.577776 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b063d56-a0eb-4b2d-8b53-2c63feead99e" path="/var/lib/kubelet/pods/3b063d56-a0eb-4b2d-8b53-2c63feead99e/volumes"
Mar 09 14:40:07 crc kubenswrapper[4764]: I0309 14:40:07.928983 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwqd6" event={"ID":"a9f449aa-58d0-4541-b02d-f7240113d330","Type":"ContainerStarted","Data":"974be5246029785143e51ade81934acd7da98f7f6ac1d264e3c5b0a2924db155"}
Mar 09 14:40:07 crc kubenswrapper[4764]: I0309 14:40:07.965130 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fwqd6" podStartSLOduration=2.443424111 podStartE2EDuration="5.965101926s" podCreationTimestamp="2026-03-09 14:40:02 +0000 UTC" firstStartedPulling="2026-03-09 14:40:03.869787755 +0000 UTC m=+4759.119959673" lastFinishedPulling="2026-03-09 14:40:07.39146558 +0000 UTC m=+4762.641637488" observedRunningTime="2026-03-09 14:40:07.949386146 +0000 UTC m=+4763.199558054" watchObservedRunningTime="2026-03-09 14:40:07.965101926 +0000 UTC m=+4763.215273834"
Mar 09 14:40:12 crc kubenswrapper[4764]: I0309 14:40:12.818876 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fwqd6"
Mar 09 14:40:12 crc kubenswrapper[4764]: I0309 14:40:12.819990 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fwqd6"
Mar 09 14:40:12 crc kubenswrapper[4764]: I0309 14:40:12.880691 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fwqd6"
Mar 09 14:40:13 crc kubenswrapper[4764]: I0309 14:40:13.036043 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fwqd6"
Mar 09 14:40:13 crc kubenswrapper[4764]: I0309 14:40:13.125407 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fwqd6"]
Mar 09 14:40:13 crc kubenswrapper[4764]: I0309 14:40:13.269340 4764 scope.go:117] "RemoveContainer" containerID="ea38c17635079a211c8b307953c91fb9e37a06f4db03c9c46e225e504f02afec"
Mar 09 14:40:15 crc kubenswrapper[4764]: I0309 14:40:15.003705 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fwqd6" podUID="a9f449aa-58d0-4541-b02d-f7240113d330" containerName="registry-server" containerID="cri-o://974be5246029785143e51ade81934acd7da98f7f6ac1d264e3c5b0a2924db155" gracePeriod=2
Mar 09 14:40:15 crc kubenswrapper[4764]: I0309 14:40:15.474877 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fwqd6"
Mar 09 14:40:15 crc kubenswrapper[4764]: I0309 14:40:15.584796 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9f449aa-58d0-4541-b02d-f7240113d330-catalog-content\") pod \"a9f449aa-58d0-4541-b02d-f7240113d330\" (UID: \"a9f449aa-58d0-4541-b02d-f7240113d330\") "
Mar 09 14:40:15 crc kubenswrapper[4764]: I0309 14:40:15.585070 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9f449aa-58d0-4541-b02d-f7240113d330-utilities\") pod \"a9f449aa-58d0-4541-b02d-f7240113d330\" (UID: \"a9f449aa-58d0-4541-b02d-f7240113d330\") "
Mar 09 14:40:15 crc kubenswrapper[4764]: I0309 14:40:15.585188 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6dqh\" (UniqueName: \"kubernetes.io/projected/a9f449aa-58d0-4541-b02d-f7240113d330-kube-api-access-g6dqh\") pod \"a9f449aa-58d0-4541-b02d-f7240113d330\" (UID: \"a9f449aa-58d0-4541-b02d-f7240113d330\") "
Mar 09 14:40:15 crc kubenswrapper[4764]: I0309 14:40:15.587989 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9f449aa-58d0-4541-b02d-f7240113d330-utilities" (OuterVolumeSpecName: "utilities") pod "a9f449aa-58d0-4541-b02d-f7240113d330" (UID: "a9f449aa-58d0-4541-b02d-f7240113d330"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:40:15 crc kubenswrapper[4764]: I0309 14:40:15.595024 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9f449aa-58d0-4541-b02d-f7240113d330-kube-api-access-g6dqh" (OuterVolumeSpecName: "kube-api-access-g6dqh") pod "a9f449aa-58d0-4541-b02d-f7240113d330" (UID: "a9f449aa-58d0-4541-b02d-f7240113d330"). InnerVolumeSpecName "kube-api-access-g6dqh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:40:15 crc kubenswrapper[4764]: I0309 14:40:15.687331 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9f449aa-58d0-4541-b02d-f7240113d330-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 14:40:15 crc kubenswrapper[4764]: I0309 14:40:15.687369 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6dqh\" (UniqueName: \"kubernetes.io/projected/a9f449aa-58d0-4541-b02d-f7240113d330-kube-api-access-g6dqh\") on node \"crc\" DevicePath \"\""
Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.020940 4764 generic.go:334] "Generic (PLEG): container finished" podID="a9f449aa-58d0-4541-b02d-f7240113d330" containerID="974be5246029785143e51ade81934acd7da98f7f6ac1d264e3c5b0a2924db155" exitCode=0
Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.021039 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fwqd6"
Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.021054 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwqd6" event={"ID":"a9f449aa-58d0-4541-b02d-f7240113d330","Type":"ContainerDied","Data":"974be5246029785143e51ade81934acd7da98f7f6ac1d264e3c5b0a2924db155"}
Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.021750 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwqd6" event={"ID":"a9f449aa-58d0-4541-b02d-f7240113d330","Type":"ContainerDied","Data":"c486688c64fb082ba7d8532410236dfd36e9116f3c410bc3d5d352e3a5c715f8"}
Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.021800 4764 scope.go:117] "RemoveContainer" containerID="974be5246029785143e51ade81934acd7da98f7f6ac1d264e3c5b0a2924db155"
Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.050501 4764 scope.go:117] "RemoveContainer" containerID="72f615c2613239454787d89a4736ccba1f726574b352ae01d064ac1a12aa73a8"
Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.081994 4764 scope.go:117] "RemoveContainer" containerID="25d756717148db03276e3bb732fb1351545554969bae0b614cc5f1920c670594"
Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.134440 4764 scope.go:117] "RemoveContainer" containerID="974be5246029785143e51ade81934acd7da98f7f6ac1d264e3c5b0a2924db155"
Mar 09 14:40:16 crc kubenswrapper[4764]: E0309 14:40:16.135463 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"974be5246029785143e51ade81934acd7da98f7f6ac1d264e3c5b0a2924db155\": container with ID starting with 974be5246029785143e51ade81934acd7da98f7f6ac1d264e3c5b0a2924db155 not found: ID does not exist" containerID="974be5246029785143e51ade81934acd7da98f7f6ac1d264e3c5b0a2924db155"
Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.135501 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"974be5246029785143e51ade81934acd7da98f7f6ac1d264e3c5b0a2924db155"} err="failed to get container status \"974be5246029785143e51ade81934acd7da98f7f6ac1d264e3c5b0a2924db155\": rpc error: code = NotFound desc = could not find container \"974be5246029785143e51ade81934acd7da98f7f6ac1d264e3c5b0a2924db155\": container with ID starting with 974be5246029785143e51ade81934acd7da98f7f6ac1d264e3c5b0a2924db155 not found: ID does not exist"
Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.135528 4764 scope.go:117] "RemoveContainer" containerID="72f615c2613239454787d89a4736ccba1f726574b352ae01d064ac1a12aa73a8"
Mar 09 14:40:16 crc kubenswrapper[4764]: E0309 14:40:16.136159 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72f615c2613239454787d89a4736ccba1f726574b352ae01d064ac1a12aa73a8\": container with ID starting with
72f615c2613239454787d89a4736ccba1f726574b352ae01d064ac1a12aa73a8 not found: ID does not exist" containerID="72f615c2613239454787d89a4736ccba1f726574b352ae01d064ac1a12aa73a8" Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.136215 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72f615c2613239454787d89a4736ccba1f726574b352ae01d064ac1a12aa73a8"} err="failed to get container status \"72f615c2613239454787d89a4736ccba1f726574b352ae01d064ac1a12aa73a8\": rpc error: code = NotFound desc = could not find container \"72f615c2613239454787d89a4736ccba1f726574b352ae01d064ac1a12aa73a8\": container with ID starting with 72f615c2613239454787d89a4736ccba1f726574b352ae01d064ac1a12aa73a8 not found: ID does not exist" Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.136252 4764 scope.go:117] "RemoveContainer" containerID="25d756717148db03276e3bb732fb1351545554969bae0b614cc5f1920c670594" Mar 09 14:40:16 crc kubenswrapper[4764]: E0309 14:40:16.136580 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25d756717148db03276e3bb732fb1351545554969bae0b614cc5f1920c670594\": container with ID starting with 25d756717148db03276e3bb732fb1351545554969bae0b614cc5f1920c670594 not found: ID does not exist" containerID="25d756717148db03276e3bb732fb1351545554969bae0b614cc5f1920c670594" Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.136609 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25d756717148db03276e3bb732fb1351545554969bae0b614cc5f1920c670594"} err="failed to get container status \"25d756717148db03276e3bb732fb1351545554969bae0b614cc5f1920c670594\": rpc error: code = NotFound desc = could not find container \"25d756717148db03276e3bb732fb1351545554969bae0b614cc5f1920c670594\": container with ID starting with 25d756717148db03276e3bb732fb1351545554969bae0b614cc5f1920c670594 not found: ID does not 
exist" Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.518109 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9f449aa-58d0-4541-b02d-f7240113d330-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9f449aa-58d0-4541-b02d-f7240113d330" (UID: "a9f449aa-58d0-4541-b02d-f7240113d330"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.609543 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9f449aa-58d0-4541-b02d-f7240113d330-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.661495 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fwqd6"] Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.671554 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fwqd6"] Mar 09 14:40:17 crc kubenswrapper[4764]: I0309 14:40:17.572541 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9f449aa-58d0-4541-b02d-f7240113d330" path="/var/lib/kubelet/pods/a9f449aa-58d0-4541-b02d-f7240113d330/volumes" Mar 09 14:40:36 crc kubenswrapper[4764]: I0309 14:40:36.786191 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="0c87ed75-4285-4084-bce3-ee8dba7671c0" containerName="galera" probeResult="failure" output="command timed out" Mar 09 14:40:36 crc kubenswrapper[4764]: I0309 14:40:36.787555 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="0c87ed75-4285-4084-bce3-ee8dba7671c0" containerName="galera" probeResult="failure" output="command timed out" Mar 09 14:40:58 crc kubenswrapper[4764]: I0309 14:40:58.371056 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:40:58 crc kubenswrapper[4764]: I0309 14:40:58.371666 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"